Going forward, AI algorithms will be incorporated into more and more everyday applications. For example, you might want to include an image classifier in a smart phone app. To do this, you'd use a deep learning model trained on hundreds of thousands of images as part of the overall application architecture. A large part of software development in the future will be using these types of models as common parts of applications.
In this project, you'll train an image classifier to recognize different species of flowers. You can imagine using something like this in a phone app that tells you the name of the flower your camera is looking at. In practice you'd train this classifier, then export it for use in your application. We'll be using this dataset of 102 flower categories, you can see a few examples below.

The project is broken down into multiple steps:
We'll lead you through each part which you'll implement in Python.
When you've completed this project, you'll have an application that can be trained on any set of labeled images. Here your network will be learning about flowers and end up as a command line application. But, what you do with your new skills depends on your imagination and effort in building a dataset. For example, imagine an app where you take a picture of a car, it tells you what the make and model is, then looks up information about it. Go build your own dataset and make something new.
First up is importing the packages you'll need. It's good practice to keep all the imports at the beginning of your code. As you work through this notebook and find you need to import a package, make sure to add the import up here.
# Imports here
%matplotlib inline
%config InlineBackend.figure_format = 'retina'
from collections import OrderedDict
import matplotlib.pyplot as plt
import numpy as np
import torch
from torch import nn
from torch import optim
import torch.nn.functional as F
from torchvision import datasets, transforms, models
Here you'll use torchvision to load the data (documentation). The data should be included alongside this notebook, otherwise you can download it here. The dataset is split into three parts, training, validation, and testing. For the training, you'll want to apply transformations such as random scaling, cropping, and flipping. This will help the network generalize leading to better performance. You'll also need to make sure the input data is resized to 224x224 pixels as required by the pre-trained networks.
The validation and testing sets are used to measure the model's performance on data it hasn't seen yet. For this you don't want any scaling or rotation transformations, but you'll need to resize then crop the images to the appropriate size.
The pre-trained networks you'll use were trained on the ImageNet dataset where each color channel was normalized separately. For all three sets you'll need to normalize the means and standard deviations of the images to what the network expects. For the means, it's [0.485, 0.456, 0.406] and for the standard deviations [0.229, 0.224, 0.225], calculated from the ImageNet images. These values will shift each color channel to be centered at 0 and range from -1 to 1.
data_dir = 'data_images'
data_groups = ['train', 'test', 'valid']
data_dirs = {
'train': data_dir + '/train',
'test': data_dir + '/test',
'valid': data_dir + '/valid'
}
# TODO: Define your transforms for the training, validation, and testing sets
data_transforms = {
'train': transforms.Compose([
transforms.RandomRotation(30),
transforms.Resize(255),
transforms.RandomResizedCrop(224),
transforms.ToTensor(),
transforms.Normalize(mean=[0.485, 0.456, 0.406],
std=[0.229, 0.224, 0.225])
]),
'test': transforms.Compose([
transforms.Resize(255),
transforms.CenterCrop(224),
transforms.ToTensor(),
transforms.Normalize(mean=[0.485, 0.456, 0.406],
std=[0.229, 0.224, 0.225])
]),
'valid': transforms.Compose([
transforms.Resize(255),
transforms.CenterCrop(224),
transforms.ToTensor(),
transforms.Normalize(mean=[0.485, 0.456, 0.406],
std=[0.229, 0.224, 0.225])
])
}
# TODO: Load the datasets with ImageFolder
image_datasets = {x: datasets.ImageFolder(data_dirs[x], transform=data_transforms[x]) for x in data_groups}
# TODO: Using the image datasets and the trainforms, define the dataloaders
dataloaders = {x: torch.utils.data.DataLoader(image_datasets[x], batch_size=32, shuffle=True) for x in data_groups}
You'll also need to load in a mapping from category label to category name. You can find this in the file cat_to_name.json. It's a JSON object which you can read in with the json module. This will give you a dictionary mapping the integer encoded categories to the actual names of the flowers.
import json
with open('cat_to_name.json', 'r') as f:
cat_to_name = json.load(f)
Now that the data is ready, it's time to build and train the classifier. As usual, you should use one of the pretrained models from torchvision.models to get the image features. Build and train a new feed-forward classifier using those features.
We're going to leave this part up to you. Refer to the rubric for guidance on successfully completing this section. Things you'll need to do:
We've left a cell open for you below, but use as many as you need. Our advice is to break the problem up into smaller parts you can run separately. Check that each part is doing what you expect, then move on to the next. You'll likely find that as you work through each part, you'll need to go back and modify your previous code. This is totally normal!
When training make sure you're updating only the weights of the feed-forward network. You should be able to get the validation accuracy above 70% if you build everything right. Make sure to try different hyperparameters (learning rate, units in the classifier, epochs, etc) to find the best model. Save those hyperparameters to use as default values in the next part of the project.
One last important tip if you're using the workspace to run your code: To avoid having your workspace disconnect during the long-running tasks in this notebook, please read in the earlier page in this lesson called Intro to GPU Workspaces about Keeping Your Session Active. You'll want to include code from the workspace_utils.py module.
Note for Workspace users: If your network is over 1 GB when saved as a checkpoint, there might be issues with saving backups in your workspace. Typically this happens with wide dense layers after the convolutional layers. If your saved checkpoint is larger than 1 GB (you can open a terminal and check with ls -lh), you should reduce the size of your hidden layers and train again.
def imshow(image, ax=None, title=None, normalize=True):
"""Imshow for Tensor."""
if ax is None:
fig, ax = plt.subplots()
image = image.numpy().transpose((1, 2, 0))
if normalize:
mean = np.array([0.485, 0.456, 0.406])
std = np.array([0.229, 0.224, 0.225])
image = std * image + mean
image = np.clip(image, 0, 1)
ax.imshow(image)
ax.spines['top'].set_visible(False)
ax.spines['right'].set_visible(False)
ax.spines['left'].set_visible(False)
ax.spines['bottom'].set_visible(False)
ax.tick_params(axis='both', length=0)
ax.set_xticklabels('')
ax.set_yticklabels('')
return ax
# Test to see if our images are loaded correctly
images, labels = next(iter(dataloaders['train']))
imshow(images[0], normalize=True)
print(labels)
tensor([ 65, 48, 4, 37, 40, 6, 73, 77, 93, 56, 88, 2, 99, 12,
80, 96, 48, 101, 62, 9, 26, 100, 41, 84, 5, 83, 97, 71,
49, 50, 3, 23])
# TODO: Build and train your network
model = models.vgg19(pretrained=True)
model.class_to_idx = image_datasets['train'].class_to_idx
model
VGG(
(features): Sequential(
(0): Conv2d(3, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
(1): ReLU(inplace=True)
(2): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
(3): ReLU(inplace=True)
(4): MaxPool2d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False)
(5): Conv2d(64, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
(6): ReLU(inplace=True)
(7): Conv2d(128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
(8): ReLU(inplace=True)
(9): MaxPool2d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False)
(10): Conv2d(128, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
(11): ReLU(inplace=True)
(12): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
(13): ReLU(inplace=True)
(14): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
(15): ReLU(inplace=True)
(16): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
(17): ReLU(inplace=True)
(18): MaxPool2d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False)
(19): Conv2d(256, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
(20): ReLU(inplace=True)
(21): Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
(22): ReLU(inplace=True)
(23): Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
(24): ReLU(inplace=True)
(25): Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
(26): ReLU(inplace=True)
(27): MaxPool2d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False)
(28): Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
(29): ReLU(inplace=True)
(30): Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
(31): ReLU(inplace=True)
(32): Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
(33): ReLU(inplace=True)
(34): Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
(35): ReLU(inplace=True)
(36): MaxPool2d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False)
)
(avgpool): AdaptiveAvgPool2d(output_size=(7, 7))
(classifier): Sequential(
(0): Linear(in_features=25088, out_features=4096, bias=True)
(1): ReLU(inplace=True)
(2): Dropout(p=0.5, inplace=False)
(3): Linear(in_features=4096, out_features=4096, bias=True)
(4): ReLU(inplace=True)
(5): Dropout(p=0.5, inplace=False)
(6): Linear(in_features=4096, out_features=1000, bias=True)
)
)
model.class_to_idx = image_datasets['train'].class_to_idx
def get_optimizer(state_dict=None):
"""
returns an instanitated Adam optimizer, and will load
state if applicable
"""
optimizer = optim.Adam(model.classifier.parameters(), lr=0.001)
if state_dict:
optimizer.load_state_dict(state_dict)
return optimizer
def get_classifier():
return nn.Sequential(
OrderedDict([
('fc1', nn.Linear(25088, 4096)),
('relu', nn.ReLU()),
('dropout', nn.Dropout(p=0.2, inplace=False)),
('fc2', nn.Linear(4096, 2048)),
('relu2', nn.ReLU()),
('dropout2',nn.Dropout(p=0.5, inplace=False)),
('fc3', nn.Linear(2048, 102)),
('output', nn.LogSoftmax(dim=1))
]))
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
for param in model.parameters():
param.requires_grad = False
model.classifier = get_classifier()
# define our loss and optimiziers
criterion = nn.NLLLoss()
optimizer = get_optimizer()
print(device)
cuda
model
VGG(
(features): Sequential(
(0): Conv2d(3, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
(1): ReLU(inplace=True)
(2): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
(3): ReLU(inplace=True)
(4): MaxPool2d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False)
(5): Conv2d(64, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
(6): ReLU(inplace=True)
(7): Conv2d(128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
(8): ReLU(inplace=True)
(9): MaxPool2d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False)
(10): Conv2d(128, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
(11): ReLU(inplace=True)
(12): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
(13): ReLU(inplace=True)
(14): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
(15): ReLU(inplace=True)
(16): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
(17): ReLU(inplace=True)
(18): MaxPool2d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False)
(19): Conv2d(256, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
(20): ReLU(inplace=True)
(21): Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
(22): ReLU(inplace=True)
(23): Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
(24): ReLU(inplace=True)
(25): Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
(26): ReLU(inplace=True)
(27): MaxPool2d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False)
(28): Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
(29): ReLU(inplace=True)
(30): Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
(31): ReLU(inplace=True)
(32): Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
(33): ReLU(inplace=True)
(34): Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
(35): ReLU(inplace=True)
(36): MaxPool2d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False)
)
(avgpool): AdaptiveAvgPool2d(output_size=(7, 7))
(classifier): Sequential(
(fc1): Linear(in_features=25088, out_features=4096, bias=True)
(relu): ReLU()
(dropout): Dropout(p=0.2, inplace=False)
(fc2): Linear(in_features=4096, out_features=2048, bias=True)
(relu2): ReLU()
(dropout2): Dropout(p=0.5, inplace=False)
(fc3): Linear(in_features=2048, out_features=102, bias=True)
(output): LogSoftmax(dim=1)
)
)
def validate_model(model, criterion, dataloader):
model.eval()
model.to(device=device)
accuracy, test_loss = 0, 0
for inputs, labels in dataloader:
inputs, labels = inputs.to(device), labels.to(device)
output = model.forward(inputs)
test_loss += criterion(output, labels).item()
ps = torch.exp(output)
top_p, top_class = ps.topk(1, dim=1)
equals = top_class == labels.view(*top_class.shape)
accuracy += torch.mean(equals.type(torch.FloatTensor)).item()
return test_loss/len(dataloader), accuracy/len(dataloader)
model.to(device)
epochs = 10
steps = 0
running_loss = 0
print_every = 15
for e in range(epochs):
running_loss = 0
for inputs, labels in dataloaders['train']:
steps += 1
# Move our inputs and labels to the default device, which in this example is the cpu
inputs, labels = inputs.to(device), labels.to(device)
optimizer.zero_grad()
log_ps = model(inputs)
loss = criterion(log_ps, labels)
loss.backward()
optimizer.step()
# track a running total of of loss as we train the network
running_loss += loss.item()
if steps % print_every == 0:
loss, accuracy = validate_model(model=model, criterion=criterion, dataloader=dataloaders['valid'])
print("Epoch: {}/{} ".format(e+1, epochs),
"Training Loss: {:.3f} ".format(running_loss/print_every),
"Validation Loss: {:.3f} ".format(loss),
"Validation Accuracy: {:.3f}".format(accuracy))
running_loss = 0
# Put model back in training mode
model.train()
Epoch: 1/10 Training Loss: 2.527 Validation Loss: 2.157 Validation Accuracy: 0.425 Epoch: 1/10 Training Loss: 2.458 Validation Loss: 1.986 Validation Accuracy: 0.490 Epoch: 1/10 Training Loss: 2.435 Validation Loss: 1.744 Validation Accuracy: 0.539 Epoch: 1/10 Training Loss: 2.392 Validation Loss: 1.756 Validation Accuracy: 0.516 Epoch: 1/10 Training Loss: 2.599 Validation Loss: 1.655 Validation Accuracy: 0.545 Epoch: 1/10 Training Loss: 2.233 Validation Loss: 1.511 Validation Accuracy: 0.579 Epoch: 1/10 Training Loss: 2.315 Validation Loss: 1.583 Validation Accuracy: 0.556 Epoch: 1/10 Training Loss: 2.249 Validation Loss: 1.443 Validation Accuracy: 0.598 Epoch: 1/10 Training Loss: 2.073 Validation Loss: 1.331 Validation Accuracy: 0.628 Epoch: 1/10 Training Loss: 2.084 Validation Loss: 1.327 Validation Accuracy: 0.631 Epoch: 1/10 Training Loss: 2.002 Validation Loss: 1.327 Validation Accuracy: 0.649 Epoch: 1/10 Training Loss: 1.894 Validation Loss: 1.273 Validation Accuracy: 0.655 Epoch: 1/10 Training Loss: 2.075 Validation Loss: 1.225 Validation Accuracy: 0.663 Epoch: 2/10 Training Loss: 0.680 Validation Loss: 1.295 Validation Accuracy: 0.654 Epoch: 2/10 Training Loss: 1.729 Validation Loss: 1.240 Validation Accuracy: 0.670 Epoch: 2/10 Training Loss: 1.761 Validation Loss: 1.210 Validation Accuracy: 0.661 Epoch: 2/10 Training Loss: 1.813 Validation Loss: 1.316 Validation Accuracy: 0.648 Epoch: 2/10 Training Loss: 1.872 Validation Loss: 1.145 Validation Accuracy: 0.671 Epoch: 2/10 Training Loss: 1.730 Validation Loss: 1.197 Validation Accuracy: 0.690 Epoch: 2/10 Training Loss: 1.831 Validation Loss: 1.133 Validation Accuracy: 0.698 Epoch: 2/10 Training Loss: 1.831 Validation Loss: 1.090 Validation Accuracy: 0.715 Epoch: 2/10 Training Loss: 1.782 Validation Loss: 1.013 Validation Accuracy: 0.721 Epoch: 2/10 Training Loss: 1.689 Validation Loss: 1.034 Validation Accuracy: 0.719 Epoch: 2/10 Training Loss: 1.905 Validation Loss: 0.959 Validation Accuracy: 0.742 Epoch: 2/10 Training Loss: 1.734 Validation Loss: 1.020 Validation Accuracy: 0.731 Epoch: 2/10 Training Loss: 1.743 Validation Loss: 1.033 Validation Accuracy: 0.720 Epoch: 2/10 Training Loss: 1.880 Validation Loss: 1.025 Validation Accuracy: 0.711 Epoch: 3/10 Training Loss: 1.171 Validation Loss: 1.079 Validation Accuracy: 0.716 Epoch: 3/10 Training Loss: 1.780 Validation Loss: 1.062 Validation Accuracy: 0.723 Epoch: 3/10 Training Loss: 1.630 Validation Loss: 1.118 Validation Accuracy: 0.706 Epoch: 3/10 Training Loss: 1.634 Validation Loss: 0.884 Validation Accuracy: 0.768 Epoch: 3/10 Training Loss: 1.572 Validation Loss: 0.956 Validation Accuracy: 0.755 Epoch: 3/10 Training Loss: 1.730 Validation Loss: 1.065 Validation Accuracy: 0.715 Epoch: 3/10 Training Loss: 1.784 Validation Loss: 0.983 Validation Accuracy: 0.730 Epoch: 3/10 Training Loss: 1.681 Validation Loss: 1.036 Validation Accuracy: 0.734 Epoch: 3/10 Training Loss: 1.628 Validation Loss: 0.970 Validation Accuracy: 0.742 Epoch: 3/10 Training Loss: 1.584 Validation Loss: 0.882 Validation Accuracy: 0.747 Epoch: 3/10 Training Loss: 1.479 Validation Loss: 0.905 Validation Accuracy: 0.759 Epoch: 3/10 Training Loss: 1.650 Validation Loss: 0.999 Validation Accuracy: 0.723 Epoch: 3/10 Training Loss: 1.780 Validation Loss: 0.874 Validation Accuracy: 0.765 Epoch: 3/10 Training Loss: 1.581 Validation Loss: 0.906 Validation Accuracy: 0.753 Epoch: 4/10 Training Loss: 1.508 Validation Loss: 0.891 Validation Accuracy: 0.763 Epoch: 4/10 Training Loss: 1.587 Validation Loss: 0.925 Validation Accuracy: 0.750 Epoch: 4/10 Training Loss: 1.528 Validation Loss: 0.851 Validation Accuracy: 0.775 Epoch: 4/10 Training Loss: 1.696 Validation Loss: 0.952 Validation Accuracy: 0.738 Epoch: 4/10 Training Loss: 1.498 Validation Loss: 0.838 Validation Accuracy: 0.781 Epoch: 4/10 Training Loss: 1.672 Validation Loss: 0.868 Validation Accuracy: 0.768 Epoch: 4/10 Training Loss: 1.385 Validation Loss: 0.864 Validation Accuracy: 0.774 Epoch: 4/10 Training Loss: 1.461 Validation Loss: 0.766 Validation Accuracy: 0.791 Epoch: 4/10 Training Loss: 1.500 Validation Loss: 0.793 Validation Accuracy: 0.797 Epoch: 4/10 Training Loss: 1.422 Validation Loss: 0.796 Validation Accuracy: 0.793 Epoch: 4/10 Training Loss: 1.571 Validation Loss: 0.729 Validation Accuracy: 0.809 Epoch: 4/10 Training Loss: 1.458 Validation Loss: 0.745 Validation Accuracy: 0.791 Epoch: 4/10 Training Loss: 1.408 Validation Loss: 0.849 Validation Accuracy: 0.780 Epoch: 5/10 Training Loss: 0.479 Validation Loss: 0.762 Validation Accuracy: 0.806 Epoch: 5/10 Training Loss: 1.646 Validation Loss: 0.922 Validation Accuracy: 0.759 Epoch: 5/10 Training Loss: 1.398 Validation Loss: 0.826 Validation Accuracy: 0.794 Epoch: 5/10 Training Loss: 1.482 Validation Loss: 0.736 Validation Accuracy: 0.804 Epoch: 5/10 Training Loss: 1.434 Validation Loss: 0.888 Validation Accuracy: 0.767 Epoch: 5/10 Training Loss: 1.532 Validation Loss: 0.773 Validation Accuracy: 0.778 Epoch: 5/10 Training Loss: 1.459 Validation Loss: 0.874 Validation Accuracy: 0.786 Epoch: 5/10 Training Loss: 1.427 Validation Loss: 0.751 Validation Accuracy: 0.794 Epoch: 5/10 Training Loss: 1.438 Validation Loss: 0.759 Validation Accuracy: 0.805 Epoch: 5/10 Training Loss: 1.465 Validation Loss: 0.822 Validation Accuracy: 0.791 Epoch: 5/10 Training Loss: 1.513 Validation Loss: 0.753 Validation Accuracy: 0.811 Epoch: 5/10 Training Loss: 1.482 Validation Loss: 0.827 Validation Accuracy: 0.782 Epoch: 5/10 Training Loss: 1.487 Validation Loss: 0.797 Validation Accuracy: 0.798 Epoch: 5/10 Training Loss: 1.429 Validation Loss: 0.888 Validation Accuracy: 0.772 Epoch: 6/10 Training Loss: 0.829 Validation Loss: 0.784 Validation Accuracy: 0.797 Epoch: 6/10 Training Loss: 1.322 Validation Loss: 0.788 Validation Accuracy: 0.788 Epoch: 6/10 Training Loss: 1.464 Validation Loss: 0.802 Validation Accuracy: 0.792 Epoch: 6/10 Training Loss: 1.424 Validation Loss: 0.740 Validation Accuracy: 0.799 Epoch: 6/10 Training Loss: 1.347 Validation Loss: 0.760 Validation Accuracy: 0.800 Epoch: 6/10 Training Loss: 1.512 Validation Loss: 0.723 Validation Accuracy: 0.801 Epoch: 6/10 Training Loss: 1.462 Validation Loss: 0.789 Validation Accuracy: 0.787 Epoch: 6/10 Training Loss: 1.399 Validation Loss: 0.774 Validation Accuracy: 0.804 Epoch: 6/10 Training Loss: 1.556 Validation Loss: 0.740 Validation Accuracy: 0.796 Epoch: 6/10 Training Loss: 1.274 Validation Loss: 0.751 Validation Accuracy: 0.808 Epoch: 6/10 Training Loss: 1.378 Validation Loss: 0.697 Validation Accuracy: 0.828 Epoch: 6/10 Training Loss: 1.398 Validation Loss: 0.723 Validation Accuracy: 0.806 Epoch: 6/10 Training Loss: 1.303 Validation Loss: 0.693 Validation Accuracy: 0.819 Epoch: 6/10 Training Loss: 1.420 Validation Loss: 0.721 Validation Accuracy: 0.800 Epoch: 7/10 Training Loss: 1.482 Validation Loss: 0.791 Validation Accuracy: 0.811 Epoch: 7/10 Training Loss: 1.374 Validation Loss: 0.757 Validation Accuracy: 0.800 Epoch: 7/10 Training Loss: 1.466 Validation Loss: 0.773 Validation Accuracy: 0.799 Epoch: 7/10 Training Loss: 1.377 Validation Loss: 0.783 Validation Accuracy: 0.803 Epoch: 7/10 Training Loss: 1.394 Validation Loss: 0.819 Validation Accuracy: 0.789 Epoch: 7/10 Training Loss: 1.449 Validation Loss: 0.733 Validation Accuracy: 0.809 Epoch: 7/10 Training Loss: 1.351 Validation Loss: 0.704 Validation Accuracy: 0.812 Epoch: 7/10 Training Loss: 1.275 Validation Loss: 0.747 Validation Accuracy: 0.816 Epoch: 7/10 Training Loss: 1.284 Validation Loss: 0.736 Validation Accuracy: 0.808 Epoch: 7/10 Training Loss: 1.429 Validation Loss: 0.810 Validation Accuracy: 0.803 Epoch: 7/10 Training Loss: 1.398 Validation Loss: 0.811 Validation Accuracy: 0.795 Epoch: 7/10 Training Loss: 1.476 Validation Loss: 0.765 Validation Accuracy: 0.790 Epoch: 7/10 Training Loss: 1.458 Validation Loss: 0.729 Validation Accuracy: 0.813 Epoch: 8/10 Training Loss: 0.363 Validation Loss: 0.680 Validation Accuracy: 0.817 Epoch: 8/10 Training Loss: 1.181 Validation Loss: 0.672 Validation Accuracy: 0.826 Epoch: 8/10 Training Loss: 1.127 Validation Loss: 0.727 Validation Accuracy: 0.814 Epoch: 8/10 Training Loss: 1.279 Validation Loss: 0.708 Validation Accuracy: 0.814 Epoch: 8/10 Training Loss: 1.459 Validation Loss: 0.665 Validation Accuracy: 0.822 Epoch: 8/10 Training Loss: 1.236 Validation Loss: 0.722 Validation Accuracy: 0.832 Epoch: 8/10 Training Loss: 1.355 Validation Loss: 0.752 Validation Accuracy: 0.805 Epoch: 8/10 Training Loss: 1.493 Validation Loss: 0.731 Validation Accuracy: 0.817 Epoch: 8/10 Training Loss: 1.587 Validation Loss: 0.703 Validation Accuracy: 0.829 Epoch: 8/10 Training Loss: 1.393 Validation Loss: 0.744 Validation Accuracy: 0.813 Epoch: 8/10 Training Loss: 1.469 Validation Loss: 0.749 Validation Accuracy: 0.804 Epoch: 8/10 Training Loss: 1.355 Validation Loss: 0.793 Validation Accuracy: 0.802 Epoch: 8/10 Training Loss: 1.393 Validation Loss: 0.716 Validation Accuracy: 0.809 Epoch: 8/10 Training Loss: 1.478 Validation Loss: 0.862 Validation Accuracy: 0.802 Epoch: 9/10 Training Loss: 1.016 Validation Loss: 0.809 Validation Accuracy: 0.785 Epoch: 9/10 Training Loss: 1.268 Validation Loss: 0.688 Validation Accuracy: 0.807 Epoch: 9/10 Training Loss: 1.307 Validation Loss: 0.727 Validation Accuracy: 0.817 Epoch: 9/10 Training Loss: 1.488 Validation Loss: 0.841 Validation Accuracy: 0.787 Epoch: 9/10 Training Loss: 1.271 Validation Loss: 0.669 Validation Accuracy: 0.821 Epoch: 9/10 Training Loss: 1.387 Validation Loss: 0.638 Validation Accuracy: 0.827 Epoch: 9/10 Training Loss: 1.242 Validation Loss: 0.680 Validation Accuracy: 0.824 Epoch: 9/10 Training Loss: 1.183 Validation Loss: 0.707 Validation Accuracy: 0.818 Epoch: 9/10 Training Loss: 1.307 Validation Loss: 0.732 Validation Accuracy: 0.818 Epoch: 9/10 Training Loss: 1.467 Validation Loss: 0.717 Validation Accuracy: 0.823 Epoch: 9/10 Training Loss: 1.460 Validation Loss: 0.711 Validation Accuracy: 0.816 Epoch: 9/10 Training Loss: 1.233 Validation Loss: 0.766 Validation Accuracy: 0.829 Epoch: 9/10 Training Loss: 1.374 Validation Loss: 0.714 Validation Accuracy: 0.831 Epoch: 9/10 Training Loss: 1.330 Validation Loss: 0.697 Validation Accuracy: 0.835 Epoch: 10/10 Training Loss: 1.368 Validation Loss: 0.760 Validation Accuracy: 0.811 Epoch: 10/10 Training Loss: 1.117 Validation Loss: 0.759 Validation Accuracy: 0.823 Epoch: 10/10 Training Loss: 1.315 Validation Loss: 0.768 Validation Accuracy: 0.807 Epoch: 10/10 Training Loss: 1.344 Validation Loss: 0.803 Validation Accuracy: 0.816 Epoch: 10/10 Training Loss: 1.229 Validation Loss: 0.768 Validation Accuracy: 0.812 Epoch: 10/10 Training Loss: 1.162 Validation Loss: 0.793 Validation Accuracy: 0.808 Epoch: 10/10 Training Loss: 1.314 Validation Loss: 0.732 Validation Accuracy: 0.823 Epoch: 10/10 Training Loss: 1.508 Validation Loss: 0.717 Validation Accuracy: 0.830 Epoch: 10/10 Training Loss: 1.484 Validation Loss: 0.789 Validation Accuracy: 0.808 Epoch: 10/10 Training Loss: 1.521 Validation Loss: 0.775 Validation Accuracy: 0.810 Epoch: 10/10 Training Loss: 1.333 Validation Loss: 0.743 Validation Accuracy: 0.803 Epoch: 10/10 Training Loss: 1.382 Validation Loss: 0.687 Validation Accuracy: 0.830 Epoch: 10/10 Training Loss: 1.266 Validation Loss: 0.622 Validation Accuracy: 0.846
It's good practice to test your trained network on test data, images the network has never seen either in training or validation. This will give you a good estimate for the model's performance on completely new images. Run the test images through the network and measure the accuracy, the same way you did validation. You should be able to reach around 70% accuracy on the test set if the model has been trained well.
# TODO: Do validation on the test set
import time
def test_network(model, data_type, device, criterion):
model.to(device=device)
test_loss = 0
accuracy = 0
model.eval()
print(f'Runing on : {device}')
with torch.no_grad():
for inputs, labels in dataloaders[data_type]:
inputs, labels = inputs.to(device), labels.to(device)
logps = model.forward(inputs)
batch_loss = criterion(logps, labels)
test_loss += batch_loss.item()
# accuracy
ps = torch.exp(logps)
top_p, top_class = ps.topk(1, dim=1)
equals = top_class == labels.view(*top_class.shape)
accuracy += torch.mean(equals.type(torch.FloatTensor)).item()
print(f'Test Accuracy: {accuracy/len(dataloaders["test"]):3f}')
print('-----')
model.train()
test_network(model=model, data_type='test', device=device, criterion=criterion)
Runing on : cuda Test Accuracy: 0.807629 -----
Now that your network is trained, save the model so you can load it later for making predictions. You probably want to save other things such as the mapping of classes to indices which you get from one of the image datasets: image_datasets['train'].class_to_idx. You can attach this to the model as an attribute which makes inference easier later on.
model.class_to_idx = image_datasets['train'].class_to_idx
Remember that you'll want to completely rebuild the model later so you can use it for inference. Make sure to include any information you need in the checkpoint. If you want to load the model and keep training, you'll want to save the number of epochs as well as the optimizer state, optimizer.state_dict. You'll likely want to use this trained model in the next part of the project, so best to save it now.
# TODO: Save the checkpoint
model.class_to_idx = image_datasets['train']
state = {
'model': 'vgg19',
'epoch': epochs,
'state_dict': model.state_dict(),
'optimizer': optimizer.state_dict(),
'class_to_idx': model.class_to_idx
}
torch.save(state, 'checkpoint.pth')
At this point it's good to write a function that can load a checkpoint and rebuild the model. That way you can come back to this project and keep working on it without having to retrain the network.
# TODO: Write a function that loads a checkpoint and rebuilds the model
def load_checkpoint(path):
checkpoint = torch.load(path)
print(checkpoint)
if checkpoint['model'] == 'vgg19':
model = models.vgg19(pretrained=True)
else:
raise Exception("Model type not valid")
for p in model.parameters():
p.requires_grad = False
classifier = get_classifier()
model.classifier = classifier
model.load_state_dict(checkpoint['state_dict'])
model.class_to_idx = checkpoint['class_to_idx']
optimizer = get_optimizer(state_dict=checkpoint['optimizer'])
return model, optimizer
# Lets test our saving and loading
model2, optimizer2 = load_checkpoint('checkpoint.pth')
{'model': 'vgg19', 'epoch': 10, 'state_dict': OrderedDict([('features.0.weight', tensor([[[[-5.3474e-02, -4.9257e-02, -6.7942e-02],
[ 1.5314e-02, 4.5068e-02, 2.1444e-03],
[ 3.6226e-02, 1.9999e-02, 1.9864e-02]],
[[ 1.7015e-02, 5.5403e-02, -6.2293e-03],
[ 1.4165e-01, 2.2705e-01, 1.3758e-01],
[ 1.2000e-01, 2.0030e-01, 9.2114e-02]],
[[-4.4885e-02, 1.2680e-02, -1.4497e-02],
[ 5.9742e-02, 1.3955e-01, 5.4102e-02],
[-9.6141e-04, 5.8304e-02, -2.9663e-02]]],
[[[ 2.6072e-01, -3.0489e-01, -5.0152e-01],
[ 4.1376e-01, -2.0831e-01, -4.9086e-01],
[ 5.8770e-01, 4.2851e-01, -1.3850e-01]],
[[ 2.8746e-01, -3.3338e-01, -4.5564e-01],
[ 3.7836e-01, -2.9144e-01, -4.9720e-01],
[ 5.4778e-01, 4.8983e-01, -1.7166e-01]],
[[ 6.7260e-02, -9.5386e-02, -3.8037e-02],
[ 6.1955e-02, -1.3125e-01, -1.0691e-01],
[ 4.8107e-02, 2.2999e-01, -3.0578e-02]]],
[[[-3.2457e-02, 1.6281e-01, 5.9687e-02],
[ 1.3960e-01, 3.7732e-01, 2.3204e-01],
[ 3.0062e-02, 1.9476e-01, 8.5276e-02]],
[[-9.5406e-02, 9.6072e-02, -2.5564e-02],
[ 2.3299e-02, 2.8450e-01, 9.4697e-02],
[-1.4335e-01, -6.8587e-05, -1.0202e-01]],
[[-1.2480e-01, 5.2403e-02, -2.6687e-02],
[-4.1414e-02, 1.7935e-01, 4.9905e-02],
[-1.1839e-01, -2.0942e-02, -1.0207e-01]]],
...,
[[[-5.2884e-02, -1.1182e-01, -2.2377e-02],
[-1.8517e-01, -2.6329e-01, -4.5673e-02],
[-1.2462e-01, -1.5776e-01, -2.5907e-02]],
[[-6.8542e-02, -1.0528e-01, -5.6703e-02],
[-1.4858e-01, -1.7634e-01, 3.1325e-02],
[-9.2168e-02, -4.9276e-02, 3.2291e-02]],
[[ 1.2216e-01, 2.0694e-01, 1.8405e-01],
[ 1.5762e-01, 2.3937e-01, 2.8790e-01],
[ 8.4974e-02, 1.7520e-01, 1.5766e-01]]],
[[[ 2.5902e-02, 3.8224e-01, 2.9388e-01],
[-4.7560e-01, -3.6006e-01, 2.3282e-01],
[-2.2283e-01, 8.3313e-04, 1.5329e-01]],
[[ 1.7674e-01, 3.8770e-01, 2.6036e-02],
[-3.9036e-01, -5.0041e-01, 1.7150e-03],
[ 6.1660e-02, 1.4792e-01, 5.1035e-02]],
[[ 1.2967e-01, 7.5381e-02, -3.8851e-01],
[ 5.0931e-02, -1.9381e-01, -1.7501e-01],
[ 3.4483e-01, 2.1557e-01, -8.3478e-02]]],
[[[-8.1056e-01, -7.4319e-01, -7.7885e-01],
[-1.6934e-01, 3.4232e-01, -7.0197e-02],
[ 5.2494e-01, 9.5989e-01, 7.6209e-01]],
[[ 7.9164e-02, 2.4559e-01, -1.5317e-01],
[-7.0860e-02, 4.4652e-01, -3.8074e-01],
[-1.5309e-01, 1.2427e-01, -1.1070e-01]],
[[ 5.2029e-01, 7.5736e-01, 6.2371e-01],
[-1.0733e-01, 1.8762e-01, -1.2183e-01],
[-6.6407e-01, -6.4891e-01, -5.5356e-01]]]])), ('features.0.bias', tensor([-0.9130, 0.3068, -1.3064, -0.7762, -0.7888, -0.4155, 0.2666, -0.8560,
0.3901, 0.1206, 0.2143, 0.3767, 0.2672, -0.8205, 0.0463, 0.4325,
0.3040, -0.1048, 0.4146, 0.3701, 0.4728, 0.4447, 0.1775, -1.1050,
0.3911, -0.8114, 0.0029, 0.2943, 0.2926, 0.5354, 0.4415, 0.4302,
0.5140, 0.4039, 0.4905, 0.3326, 0.3844, 0.3955, 0.4361, 0.2157,
0.2640, 0.3557, -0.7006, -0.0398, 0.1095, -1.2560, 0.0400, -0.2300,
0.0763, -0.4009, -0.8053, 0.3830, -0.2696, -0.3153, 0.4309, 0.3720,
-0.2352, -0.2580, 0.2720, 0.2830, -0.2227, -0.1897, 0.3060, 0.3920])), ('features.2.weight', tensor([[[[ 5.3294e-02, 8.2804e-02, 8.5524e-02],
[ 2.6976e-02, 3.2650e-02, 5.4144e-02],
[-4.4511e-02, -5.2258e-03, -4.1381e-03]],
[[-1.9152e-02, 2.2080e-02, 1.0595e-02],
[-4.2938e-02, -1.4895e-02, -4.5747e-02],
[-1.8811e-02, -2.8152e-02, -6.3361e-02]],
[[ 4.0121e-02, 1.2453e-01, 9.8356e-02],
[-3.7617e-02, 4.6662e-02, -4.3144e-03],
[-1.1518e-01, -9.6260e-02, -8.5768e-02]],
...,
[[-4.8365e-02, -7.3883e-02, -8.2087e-02],
[ 4.3794e-02, 1.8011e-02, -1.8619e-04],
[ 1.0096e-01, 8.9305e-02, 4.2637e-02]],
[[ 2.3180e-02, 8.8373e-03, -7.0260e-02],
[ 6.0539e-02, 2.2940e-02, 1.7419e-02],
[ 2.5101e-03, 7.3809e-02, 3.4366e-02]],
[[-6.5116e-02, -6.6284e-02, -8.2296e-02],
[-1.2183e-01, -1.4542e-01, -1.4393e-01],
[-1.5220e-01, -1.7159e-01, -1.3308e-01]]],
[[[ 2.6774e-02, 2.9679e-02, 2.7089e-02],
[ 2.6548e-02, 3.1258e-02, 2.2046e-02],
[ 3.1919e-03, 2.0701e-02, 2.0648e-02]],
[[ 4.7955e-03, -9.9883e-03, -1.4898e-03],
[-1.3246e-02, -2.0732e-02, -4.8087e-03],
[-6.4887e-03, -8.0587e-03, -2.8962e-03]],
[[-1.4939e-02, -1.3725e-02, -3.3936e-02],
[-1.3005e-02, 6.6903e-03, -1.6996e-02],
[-1.8187e-02, -1.9412e-02, -3.9566e-03]],
...,
[[-1.1502e-03, 1.2112e-02, -1.3131e-02],
[-8.8473e-04, -1.1336e-02, 4.0219e-03],
[-1.1732e-02, 2.0105e-02, -4.5205e-03]],
[[-3.1948e-02, -1.6055e-02, -1.2947e-02],
[-3.5695e-02, -2.2709e-02, -1.2281e-02],
[-1.8461e-02, -7.0708e-03, -1.5288e-02]],
[[-3.1178e-02, -2.1091e-02, -1.8830e-02],
[-4.8438e-03, -1.0917e-02, 8.9276e-03],
[-2.3880e-03, -2.3102e-02, -1.8260e-02]]],
[[[ 1.8644e-02, -2.4135e-02, -2.8590e-02],
[ 8.0604e-03, -2.5701e-02, -5.2046e-02],
[ 4.0025e-02, 4.8416e-03, -1.9944e-02]],
[[-3.0429e-02, -1.6110e-02, 2.0146e-02],
[-1.3046e-02, -2.7389e-02, 7.8057e-03],
[-2.1391e-02, -2.2033e-02, -2.1858e-02]],
[[ 2.6014e-02, 6.4312e-02, 3.6081e-02],
[ 3.5382e-03, 1.6420e-02, 3.6607e-02],
[ 9.6195e-04, 3.4792e-03, 9.2224e-03]],
...,
[[-6.4644e-02, -8.5212e-02, -8.6735e-02],
[-4.6778e-03, -7.3905e-02, -8.6241e-02],
[ 4.3995e-03, -3.3565e-02, -6.9997e-02]],
[[ 3.2120e-02, 2.6767e-02, -4.1272e-03],
[ 2.8396e-02, 9.7218e-03, -2.2945e-02],
[-1.0873e-02, 2.9197e-02, -2.8620e-02]],
[[ 9.9534e-03, 8.9921e-02, 5.4278e-02],
[-1.7954e-02, -7.2821e-03, 2.4107e-02],
[-6.2150e-02, -7.1851e-02, -3.6793e-02]]],
...,
[[[-1.1987e-02, -1.5818e-02, 2.6580e-03],
[-3.1439e-02, -2.1300e-02, 3.9251e-03],
[ 1.5343e-03, 7.0272e-03, 4.0676e-02]],
[[-9.9105e-03, 2.9468e-02, 3.4794e-02],
[ 7.9550e-03, 8.1483e-03, 5.2150e-02],
[-1.0531e-03, 2.2489e-02, 3.7157e-02]],
[[ 1.1501e-01, 3.5497e-02, 1.4913e-02],
[ 4.1197e-02, -1.4312e-02, -2.9663e-02],
[-2.4506e-02, -9.0076e-02, -1.0564e-01]],
...,
[[-1.9645e-01, -1.6371e-01, 2.0088e-02],
[-1.1902e-01, -8.8441e-02, 5.2635e-02],
[ 1.8133e-02, 7.3808e-02, 2.0676e-01]],
[[ 7.0686e-02, -1.2718e-02, -1.0972e-01],
[ 4.0598e-02, -3.7044e-02, -6.4024e-02],
[ 4.4543e-02, 2.8005e-03, -4.0741e-02]],
[[ 3.6190e-01, 1.8384e-01, -9.2538e-02],
[-4.4248e-02, -2.1772e-01, -2.0390e-01],
[-3.4957e-01, -3.5490e-01, -2.5407e-01]]],
[[[-7.8978e-02, -5.8846e-02, -6.1313e-02],
[-2.0378e-02, -3.5234e-02, -3.9082e-02],
[-2.4277e-02, -1.0150e-02, -9.4495e-03]],
[[-9.7051e-03, 1.6577e-02, 6.8371e-03],
[-6.4778e-02, -2.8700e-02, -2.5572e-02],
[-5.4529e-02, -3.3315e-02, -1.3923e-02]],
[[-9.2987e-02, -8.6146e-02, -7.9143e-02],
[-7.8015e-02, -6.2295e-02, -6.5622e-02],
[-5.5117e-02, -4.6543e-02, -1.9676e-02]],
...,
[[-3.2977e-02, -3.9825e-02, -4.9372e-02],
[-5.7853e-02, -3.2436e-02, -5.2240e-02],
[-4.2355e-02, -5.8713e-02, -7.6644e-02]],
[[ 7.0584e-03, 8.8199e-03, 3.8441e-02],
[-1.5273e-02, -2.1551e-02, 1.0370e-02],
[ 7.4092e-03, 1.3412e-02, 3.1040e-02]],
[[-2.7357e-02, -3.5996e-02, -2.9097e-02],
[ 1.0370e-02, -1.0657e-02, 9.2231e-03],
[ 1.5779e-02, 1.2658e-02, -7.8883e-03]]],
[[[-3.4105e-02, 6.7592e-03, 6.6066e-02],
[-3.7252e-02, -1.2786e-02, 4.4184e-02],
[-3.6817e-02, 1.0461e-02, 2.1789e-02]],
[[ 5.9445e-02, -8.0318e-02, 6.0136e-02],
[ 8.4551e-02, 3.2097e-02, -2.1433e-02],
[-4.6270e-02, 8.4302e-02, -8.1161e-02]],
[[-1.2729e-02, 8.2787e-03, 1.1650e-02],
[ 1.6725e-03, 1.8547e-02, 2.0740e-02],
[ 3.2068e-02, -8.4878e-03, 1.3071e-02]],
...,
[[-2.8666e-02, 2.8747e-02, 4.1803e-02],
[-3.6365e-02, 7.7176e-03, 4.4770e-02],
[-5.9720e-02, -3.3900e-02, 4.6291e-02]],
[[ 1.4510e-02, -5.3338e-02, -5.3871e-03],
[-6.0132e-02, 4.2245e-02, 2.2236e-02],
[-6.9056e-02, -4.5865e-02, 8.6894e-02]],
[[ 2.0230e-02, 2.4249e-02, -4.7089e-02],
[ 1.4114e-02, -6.8428e-02, 5.3686e-02],
[ 2.6819e-02, -3.7571e-02, 2.0947e-02]]]])), ('features.2.bias', tensor([-5.8394e-02, -1.4754e-01, 1.8017e-01, -2.8633e-01, 1.4053e-02,
-3.9571e-02, 1.0259e-01, -8.8048e-04, -7.7598e-02, -1.8744e-01,
-3.4763e-02, 2.2564e-02, -6.1512e-02, -3.9896e-01, -2.3920e-01,
1.1534e-01, -1.1614e-01, -2.2889e-01, -1.6896e-01, -2.4117e-01,
5.9063e-01, 4.4781e-04, -3.8955e-02, -8.2188e-01, -1.5366e-01,
2.2464e-02, 4.2782e-01, -5.8604e-02, -4.8700e-02, 3.0054e-01,
-5.6536e-02, 5.2338e-02, 2.4208e-01, -3.2136e-03, 4.8011e-01,
-8.3752e-03, 1.7856e-01, 3.9528e-01, -2.2767e-02, 5.7821e-02,
6.5798e-04, -1.3895e-01, -1.6600e-01, 7.9103e-03, 1.1497e-01,
1.0045e-01, 2.3931e-01, 4.0163e-01, -1.3901e-01, -4.5015e-01,
1.9804e-01, -1.4634e-01, -1.1509e-01, 7.2199e-02, -1.0609e-02,
3.5647e-01, 9.0835e-02, -1.5250e-01, 2.1137e-01, -2.0729e-01,
1.1017e-02, 2.3209e-01, 9.1408e-02, 2.5651e-02])), ('features.5.weight', tensor([[[[-2.3904e-02, 4.7934e-03, 1.7985e-03],
[-3.3069e-02, 9.1337e-02, 5.3385e-02],
[-3.6013e-02, -5.7739e-02, 2.7938e-02]],
[[-1.7091e-02, 1.6409e-02, 5.1454e-03],
[-5.6018e-03, 1.5107e-02, 1.4861e-02],
[-3.2590e-02, 2.3828e-04, 1.0019e-02]],
[[ 7.3900e-02, 6.3416e-02, -4.8415e-03],
[ 2.3677e-02, 1.0052e-02, -2.0522e-02],
[ 7.5609e-03, -4.4406e-03, 4.5216e-05]],
...,
[[-7.2605e-03, -1.9591e-02, 5.3953e-02],
[-2.4696e-02, -6.2119e-02, 2.4012e-02],
[ 4.8663e-02, -1.8699e-02, -7.3196e-03]],
[[-2.5791e-02, -3.6767e-02, 1.6626e-02],
[-4.5946e-03, -5.2197e-02, 1.5602e-02],
[-2.1030e-02, -2.0089e-02, 9.0214e-03]],
[[-3.3459e-02, 4.4165e-05, 7.9814e-02],
[ 4.0092e-02, -2.0911e-03, -1.3580e-02],
[-5.3820e-04, 8.9286e-02, 1.8459e-02]]],
[[[-2.8869e-02, -1.1822e-02, -1.3189e-02],
[-2.5291e-02, -4.1250e-02, -4.1437e-02],
[-4.4275e-02, -1.7290e-02, -1.4289e-02]],
[[-2.2247e-02, -6.9420e-03, -2.5159e-02],
[-2.3300e-02, -1.9245e-02, -1.4979e-02],
[-3.3361e-03, -6.3445e-03, 3.7832e-03]],
[[-1.1191e-02, -1.2995e-02, 3.2515e-03],
[-3.0186e-02, -3.6485e-02, 6.3542e-03],
[-1.5740e-02, -1.9393e-03, 3.5778e-03]],
...,
[[ 4.9005e-02, 3.7070e-02, -3.1305e-03],
[-8.3239e-03, -5.4579e-02, -8.4639e-02],
[-3.5862e-02, -4.8486e-02, -9.4776e-02]],
[[-5.5092e-03, 5.7042e-03, -8.9779e-03],
[ 3.2723e-02, 2.5234e-02, 6.5347e-03],
[ 3.7227e-02, 3.5880e-02, 4.4899e-02]],
[[-1.3448e-02, 1.2474e-01, 2.1517e-02],
[-5.9541e-02, -1.8059e-03, 3.6543e-02],
[ 1.2388e-02, -3.3950e-03, 2.0000e-02]]],
[[[ 3.6980e-02, 4.5826e-02, 2.6944e-02],
[ 4.4331e-02, -3.0718e-02, -6.1814e-03],
[ 2.8113e-02, 2.3958e-02, 1.0168e-02]],
[[-9.0606e-03, 2.8988e-03, -1.4223e-03],
[-6.1344e-04, 3.4917e-03, -4.0613e-03],
[ 4.6107e-03, -2.4847e-03, -4.5760e-03]],
[[-3.4097e-02, 1.2157e-02, 1.6104e-02],
[-2.9814e-02, -1.2230e-04, 2.3901e-02],
[ 4.2855e-03, -2.0701e-02, -8.3939e-04]],
...,
[[ 2.1600e-02, 8.4086e-03, 1.4182e-02],
[-1.8576e-02, -9.9582e-02, -6.8657e-02],
[-2.0997e-02, -1.6601e-02, -1.4148e-02]],
[[-3.3382e-02, -5.2362e-02, -6.3704e-02],
[-2.5547e-04, -4.6948e-03, -3.7894e-02],
[-2.7143e-04, 2.0461e-03, 1.8300e-02]],
[[-2.4318e-02, 1.1851e-01, 5.5574e-02],
[-1.2417e-02, 3.3460e-02, 6.9245e-02],
[ 8.1559e-02, -1.6480e-02, -6.1717e-03]]],
...,
[[[ 5.7918e-02, 1.4534e-01, 1.0669e-01],
[-1.2319e-01, -7.4554e-02, -4.1140e-02],
[-1.4782e-01, -1.0903e-01, -6.8896e-02]],
[[-4.4693e-02, -1.7215e-02, -4.2046e-02],
[-4.3458e-03, 5.9690e-03, -3.2948e-02],
[-5.3128e-02, -3.4381e-02, -8.1820e-02]],
[[-6.4706e-02, -6.0242e-02, -7.0835e-02],
[-6.4842e-02, -5.5183e-02, -4.7335e-02],
[-8.3840e-02, -6.0024e-02, -9.3879e-02]],
...,
[[ 1.5399e-01, 1.1834e-01, -4.4356e-02],
[-4.9909e-02, -1.9334e-01, -1.3299e-01],
[-1.0537e-01, -1.0887e-01, -9.6106e-02]],
[[-6.0758e-02, -8.0889e-02, -6.4580e-02],
[ 7.6854e-03, 4.4245e-03, 1.4925e-02],
[-5.5951e-03, -2.0924e-02, 8.6069e-03]],
[[ 8.7524e-02, -2.3206e-02, -7.1927e-02],
[ 1.0981e-01, -1.1768e-02, -8.1932e-02],
[ 4.6051e-02, 1.1132e-02, -1.7314e-02]]],
[[[-1.7316e-02, -1.0401e-02, -3.8132e-03],
[-2.9786e-02, -6.3108e-02, -4.5218e-02],
[ 3.8457e-02, 5.9068e-02, -8.4268e-03]],
[[ 1.0035e-02, -8.2162e-03, -9.3008e-03],
[-8.4896e-03, -6.4204e-04, 7.9028e-03],
[-2.1791e-03, -6.9253e-03, 2.8039e-03]],
[[-8.1579e-03, 3.9391e-02, 2.5788e-02],
[-8.2464e-03, 2.2815e-02, -1.4079e-02],
[-1.5388e-03, 1.3923e-02, -1.7343e-02]],
...,
[[ 1.1079e-02, 2.7110e-02, -3.3951e-02],
[-3.9773e-02, -2.5015e-02, -3.0129e-02],
[-4.2306e-03, 7.0861e-02, 5.9250e-02]],
[[-1.4169e-02, -5.3754e-02, -6.9534e-03],
[-2.2758e-02, -4.9094e-02, -4.5194e-02],
[-3.0480e-02, -5.0499e-02, -2.6545e-02]],
[[-6.6125e-03, -4.1080e-02, 5.3853e-02],
[-3.1245e-02, -1.0680e-01, -1.6912e-02],
[ 8.4784e-03, -6.3850e-02, 7.5801e-03]]],
[[[-2.1141e-02, -8.4784e-02, -9.0556e-02],
[-3.7117e-02, -7.8037e-02, -3.6485e-02],
[-1.6637e-03, -2.2322e-03, -1.3284e-02]],
[[ 8.5884e-04, -1.5348e-02, -1.9282e-02],
[-6.5435e-03, -1.5647e-02, -4.8958e-03],
[-2.1222e-03, -4.8257e-03, 1.2094e-03]],
[[-6.6725e-04, 9.0939e-03, 2.5315e-02],
[-3.4414e-02, 3.5921e-03, 3.4004e-02],
[-1.4725e-02, 3.0195e-02, 3.0666e-02]],
...,
[[-1.1471e-03, 1.3382e-02, 2.4069e-02],
[ 1.3582e-02, 7.0294e-02, 2.8989e-02],
[ 2.3322e-02, 3.7450e-02, 1.4768e-02]],
[[-1.0523e-02, -4.2178e-02, -2.8341e-02],
[-8.5121e-03, -3.4082e-02, -1.0837e-02],
[ 4.2949e-02, 1.9680e-02, 1.9693e-02]],
[[-3.9678e-02, -7.5239e-02, 4.4415e-03],
[-9.6231e-02, -3.5954e-02, 4.4141e-02],
[-1.7255e-02, 2.4688e-02, 6.7100e-02]]]])), ('features.5.bias', tensor([ 0.0246, 0.1334, 0.0083, 0.0499, 0.0313, 0.0835, -0.0160, 0.1635,
0.0932, 0.2730, -0.0729, -0.1390, 0.1081, 0.2314, -0.0126, -0.0138,
-0.0224, 0.1828, -0.0168, -0.0981, 0.1010, 0.0876, -0.0171, 0.0684,
0.1531, 0.1562, -0.1067, 0.0819, -0.0074, 0.2974, 0.0154, 0.0075,
0.0485, 0.1132, 0.1983, 0.0532, -0.0295, 0.1519, -0.0554, 0.1351,
0.1876, 0.0366, -0.1132, -0.0876, 0.0925, 0.1133, 0.1003, 0.0308,
0.1077, 0.0830, 0.1297, 0.1864, 0.0618, -0.0429, 0.0977, -0.0016,
-0.0371, 0.1010, 0.0302, 0.3531, -0.0024, -0.0248, 0.0886, 0.1082,
0.0553, 0.1412, 0.0944, 0.0037, 0.1766, -0.0232, 0.0941, 0.0587,
0.0581, -0.0023, 0.0879, 0.0208, 0.2336, 0.0670, -0.1960, -0.0462,
0.0868, -0.1003, 0.0229, 0.1479, 0.2186, 0.0067, 0.0624, -0.2049,
0.1046, 0.0477, 0.0359, -0.0860, 0.0925, 0.0408, 0.1270, -0.1244,
-0.0697, 0.2315, 0.0521, 0.1117, 0.0376, 0.0929, -0.0010, 0.0183,
0.0881, 0.3009, -0.0166, 0.0463, 0.1122, 0.1211, 0.0592, 0.0373,
0.1098, -0.0521, -0.0089, 0.1356, -0.0366, 0.0297, 0.1220, 0.1031,
-0.0067, 0.1621, -0.1397, 0.0479, -0.0287, -0.0914, 0.1077, 0.0609])), ('features.7.weight', tensor([[[[ 1.4944e-02, -3.7181e-02, 5.2933e-02],
[-1.6621e-03, -5.2030e-02, -3.8966e-03],
[ 3.9774e-02, -5.6291e-03, -2.2464e-02]],
[[-7.8208e-03, -2.0675e-02, 2.7717e-02],
[-9.7421e-02, -9.1441e-02, -1.7880e-02],
[ 4.1370e-02, 3.5009e-02, 4.7078e-02]],
[[ 4.9653e-03, 3.1277e-02, 2.4781e-02],
[-2.2425e-02, -2.7588e-02, -1.2751e-02],
[ 4.2515e-03, 3.0329e-03, 1.3350e-02]],
...,
[[ 8.8496e-02, 2.4323e-02, 2.7986e-02],
[-5.4081e-03, -1.9963e-01, -5.1464e-02],
[ 1.1866e-02, -2.7785e-02, 2.1018e-01]],
[[ 4.0391e-02, 4.4052e-03, -4.4008e-03],
[ 1.4212e-02, -9.8625e-03, 1.3768e-02],
[-1.9370e-02, -2.1431e-02, 5.2087e-03]],
[[ 1.2320e-03, -2.8975e-02, -2.2249e-02],
[ 1.6791e-02, 8.4847e-02, 4.3377e-02],
[ 1.9656e-03, 1.9694e-02, -1.9070e-03]]],
[[[ 4.8440e-02, -4.4763e-02, -1.0791e-02],
[ 4.8726e-02, -4.4417e-03, 1.4742e-02],
[ 2.1477e-02, 8.6567e-03, -3.6082e-02]],
[[-3.8701e-02, -1.2715e-02, 1.2533e-02],
[-1.8730e-02, -6.0472e-02, -2.5189e-02],
[ 2.2671e-02, -2.5291e-02, -5.8667e-02]],
[[-3.7029e-02, -3.3881e-02, -2.2431e-02],
[ 9.7309e-02, -1.4381e-02, -1.7220e-02],
[ 1.1030e-01, -3.1160e-03, -1.9181e-02]],
...,
[[ 9.0160e-04, -1.1525e-02, 3.7523e-07],
[ 8.2968e-03, -1.3651e-02, -1.7352e-02],
[ 1.1171e-02, -5.9513e-03, 2.7555e-02]],
[[ 1.7079e-02, -2.1930e-02, 1.9103e-02],
[-2.6842e-02, -6.0972e-03, -1.4670e-03],
[-1.6494e-02, 4.2461e-02, 1.8720e-02]],
[[-1.8640e-02, 6.0099e-03, 2.8122e-02],
[-3.9789e-02, -4.6455e-02, 1.4855e-02],
[ 2.6166e-02, -2.5532e-02, -9.5414e-03]]],
[[[ 4.2654e-02, 4.9242e-03, 6.8797e-03],
[ 2.3528e-02, 9.4707e-03, 3.3961e-02],
[ 8.9612e-03, 1.0389e-02, 5.5256e-02]],
[[ 5.9566e-02, -2.1563e-03, 3.1213e-02],
[ 1.6754e-03, -7.2194e-03, 4.0085e-02],
[ 1.0832e-03, 5.8919e-02, 4.4493e-02]],
[[-1.2580e-02, 5.7713e-04, 1.0059e-02],
[ 1.2030e-02, 4.6987e-03, 3.1236e-03],
[ 8.9480e-04, 1.4357e-02, 2.6447e-02]],
...,
[[-4.4140e-03, 3.4141e-02, 4.5558e-03],
[ 4.9787e-02, 8.4149e-02, -2.7116e-02],
[ 2.4199e-02, -4.0261e-02, -8.1547e-02]],
[[-6.7713e-04, -2.7730e-02, -5.4919e-02],
[-1.9235e-02, -5.4257e-02, 1.2776e-02],
[ 1.8416e-02, 2.1117e-02, 2.6926e-02]],
[[ 5.5152e-04, -2.6003e-02, -9.0034e-03],
[-1.1525e-02, 1.6315e-02, -6.7735e-03],
[ 6.3495e-03, 4.3842e-02, 2.6865e-02]]],
...,
[[[ 2.4591e-03, -5.4010e-03, 2.8915e-02],
[ 4.4389e-02, -1.2923e-02, -1.4423e-02],
[ 2.8024e-02, 7.0375e-03, -7.5480e-03]],
[[ 7.8130e-03, 2.8320e-02, 3.5959e-02],
[-1.1365e-03, -3.2333e-03, 2.3726e-02],
[ 2.0827e-03, -3.6096e-02, 2.3465e-02]],
[[ 3.3042e-03, -1.6625e-02, 7.5050e-03],
[ 2.7083e-03, -1.7120e-02, -1.6971e-02],
[ 2.6469e-03, 3.6777e-03, -7.0093e-03]],
...,
[[ 1.9847e-02, 2.4702e-02, 9.8198e-03],
[-1.9358e-04, -8.7788e-04, 1.4243e-02],
[-2.8044e-03, 1.1196e-03, 1.4663e-04]],
[[-2.7229e-02, -6.1912e-02, -3.5770e-02],
[-6.0048e-02, -3.5057e-02, 3.2179e-03],
[-3.0292e-02, -1.6820e-02, 2.0466e-02]],
[[ 9.9859e-03, -5.6339e-02, -6.2947e-02],
[-3.0399e-02, -6.7334e-02, -4.3529e-02],
[-4.1354e-02, -5.3056e-02, -3.4299e-02]]],
[[[ 6.0791e-02, -1.4006e-02, -2.4178e-02],
[-8.9995e-02, 3.9476e-02, -1.1327e-02],
[ 3.0849e-02, -9.7647e-02, 4.5945e-02]],
[[ 1.3045e-02, 7.3334e-03, -4.1774e-02],
[ 6.9863e-03, -3.7512e-02, -2.4638e-02],
[ 1.1721e-02, 1.0189e-02, 3.6039e-03]],
[[-3.5600e-02, -2.1180e-02, 4.0662e-02],
[-1.2620e-02, 1.5114e-01, 3.7204e-02],
[ 5.0749e-02, 6.9440e-02, 8.3286e-02]],
...,
[[-1.6642e-02, -2.7068e-02, -1.0900e-02],
[-2.8749e-02, -7.3130e-03, 1.8015e-03],
[-3.6275e-03, -1.8182e-02, -1.4218e-02]],
[[-3.0965e-02, 7.0792e-03, 2.7343e-03],
[-1.2273e-02, -5.5871e-02, -3.8586e-02],
[ 7.3424e-03, -3.1108e-02, -1.4742e-02]],
[[ 1.6136e-02, 2.7342e-02, -4.0301e-03],
[ 6.3811e-03, -1.7503e-02, -4.0135e-03],
[ 3.1742e-02, 1.7379e-02, -6.7108e-03]]],
[[[-2.0414e-02, -9.1146e-02, -7.5917e-02],
[-6.0958e-03, -7.2738e-02, -4.7631e-02],
[ 1.2029e-02, -2.2230e-02, -2.5029e-02]],
[[ 1.9260e-02, 8.9474e-03, -1.1453e-02],
[-1.9365e-02, 1.0711e-02, 2.0501e-02],
[ 4.3463e-03, 2.5871e-02, 2.9356e-02]],
[[-3.7830e-02, 2.6095e-02, 1.8770e-02],
[-6.5372e-02, -3.3586e-02, 4.9805e-03],
[-5.4537e-02, -3.6168e-02, -2.8787e-04]],
...,
[[-1.2545e-03, -7.3686e-03, 1.9640e-03],
[ 5.7141e-02, 2.6122e-02, 5.1896e-02],
[ 2.2060e-03, -2.0150e-02, 2.1817e-03]],
[[-9.9409e-03, -5.6645e-03, 1.7466e-04],
[-5.0877e-03, -1.7477e-02, 2.8957e-04],
[-3.7206e-02, -4.8433e-02, -2.0188e-02]],
[[ 2.3961e-02, -1.3552e-02, -2.1773e-02],
[-2.2734e-02, -2.4259e-02, -3.0014e-02],
[-6.0345e-04, -4.3971e-02, -3.8277e-02]]]])), ('features.7.bias', tensor([ 0.0836, -0.0822, -0.0695, -0.0163, 0.4373, 0.0757, -0.0441, -0.0224,
-0.1218, 0.0133, 0.0878, 0.1036, -0.0422, -0.2496, 0.0257, -0.0795,
-0.1002, -0.0494, -0.0521, -0.1298, -0.1565, -0.0364, -0.0982, -0.1845,
-0.0191, -0.0344, -0.2473, 0.0965, 0.1396, -0.0756, 0.1497, 0.0961,
-0.0823, -0.2123, 0.1143, -0.0253, -0.2277, 0.0362, -0.0131, -0.1075,
-0.2306, -0.0403, -0.0982, -0.0019, -0.0630, 0.4740, -0.0289, 0.0884,
-0.1384, -0.0431, -0.0775, 0.0037, 0.0107, -0.2188, 0.0554, 0.4694,
0.0167, -0.0631, 0.2071, 0.1297, -0.2193, 0.1582, 0.1798, 0.0265,
0.1451, 0.0028, 0.0854, -0.0785, 0.0220, 0.0946, 0.0038, 0.0620,
-0.1214, -0.1129, 0.3075, 0.2730, 0.1583, 0.1078, -0.1199, 0.0878,
-0.1505, 0.0593, 0.2910, -0.0743, 0.1736, -0.1180, -0.1052, -0.1700,
-0.0358, 0.0251, -0.0407, 0.0319, 0.0409, 0.0217, 0.0929, 0.0265,
0.0082, -0.0883, -0.1289, 0.3025, 0.2867, 0.1037, 0.1550, 0.0487,
0.0779, 0.0229, -0.0395, -0.0438, 0.0225, -0.0228, 0.0140, 0.1090,
-0.0459, 0.1879, -0.0193, 0.0035, -0.0137, -0.1241, -0.0179, -0.1592,
-0.1170, -0.3014, 0.0201, -0.0681, 0.3935, 0.1075, 0.0768, 0.0884])), ('features.10.weight', tensor([[[[ 0.0355, 0.0499, 0.0509],
[-0.0655, -0.0615, -0.0344],
[-0.0273, -0.0449, 0.0184]],
[[ 0.0075, -0.0088, -0.0093],
[ 0.0125, 0.0303, 0.0279],
[-0.0358, 0.0183, 0.0161]],
[[-0.0237, 0.0058, -0.0199],
[ 0.0264, 0.0255, -0.0009],
[ 0.0273, 0.0237, -0.0038]],
...,
[[-0.0209, -0.0111, -0.0272],
[-0.0097, -0.0067, -0.0148],
[-0.0159, 0.0143, 0.0180]],
[[-0.0296, -0.0428, -0.0323],
[ 0.0446, -0.0007, -0.0018],
[ 0.0509, 0.0475, 0.0259]],
[[ 0.0006, 0.0491, 0.0092],
[ 0.0132, 0.0277, -0.0007],
[ 0.0058, 0.0559, 0.0480]]],
[[[-0.0119, -0.0015, 0.0425],
[ 0.0156, 0.0296, 0.0560],
[ 0.0044, 0.0488, 0.0339]],
[[-0.0091, -0.0404, -0.0055],
[-0.0149, -0.0365, -0.0328],
[-0.0309, -0.0243, -0.0334]],
[[ 0.0103, 0.0230, -0.0041],
[-0.0039, 0.0277, 0.0004],
[-0.0048, 0.0282, -0.0008]],
...,
[[ 0.0389, 0.0175, 0.0166],
[-0.0121, -0.0243, -0.0208],
[-0.0037, -0.0384, -0.0167]],
[[ 0.0100, 0.0057, -0.0100],
[ 0.0005, -0.0156, 0.0078],
[-0.0099, -0.0130, 0.0080]],
[[-0.0181, -0.0284, -0.0152],
[-0.0050, -0.0162, -0.0125],
[-0.0067, 0.0012, 0.0018]]],
[[[-0.0038, 0.0246, 0.0271],
[-0.0359, -0.0045, -0.0041],
[-0.0025, 0.0264, -0.0018]],
[[ 0.0314, -0.0155, -0.0211],
[ 0.0108, -0.0148, -0.0115],
[-0.0071, -0.0097, -0.0490]],
[[-0.0007, 0.0383, 0.0214],
[-0.0060, -0.0387, -0.0017],
[-0.0505, -0.0634, -0.0215]],
...,
[[-0.0266, 0.0052, 0.0248],
[-0.0182, 0.0044, 0.0088],
[-0.0185, -0.0195, -0.0229]],
[[ 0.0182, 0.0205, 0.0358],
[-0.0131, 0.0061, -0.0296],
[-0.0116, -0.0131, 0.0019]],
[[-0.0062, 0.0159, 0.0239],
[ 0.0573, 0.0447, 0.0309],
[ 0.0180, -0.0019, 0.0044]]],
...,
[[[-0.0183, -0.0454, -0.0171],
[ 0.0063, -0.0285, -0.0384],
[-0.0234, -0.0367, 0.0045]],
[[-0.0018, 0.1324, 0.0019],
[ 0.0092, -0.0549, -0.0624],
[ 0.0236, -0.1075, -0.0784]],
[[-0.0457, -0.0532, -0.0248],
[ 0.0067, -0.0130, -0.0129],
[ 0.0473, 0.0433, 0.0330]],
...,
[[ 0.0735, 0.0394, 0.0167],
[ 0.0261, -0.0100, -0.0021],
[-0.0338, -0.0655, -0.0056]],
[[ 0.2081, 0.0646, 0.0131],
[ 0.0006, -0.0904, -0.0618],
[-0.0264, -0.0573, -0.0456]],
[[ 0.0433, 0.0004, 0.0368],
[-0.0133, -0.0258, 0.0100],
[-0.0180, -0.0109, -0.0024]]],
[[[ 0.0105, 0.0148, 0.0044],
[ 0.0066, 0.0321, -0.0019],
[ 0.0376, -0.0034, -0.0033]],
[[ 0.0203, 0.0281, -0.0354],
[-0.0520, -0.0098, 0.0052],
[ 0.0041, 0.0039, -0.0096]],
[[-0.0169, -0.0127, -0.0054],
[-0.0052, -0.0013, -0.0105],
[-0.0119, -0.0099, 0.0036]],
...,
[[-0.0031, -0.0066, 0.0069],
[-0.0457, -0.0501, -0.0005],
[ 0.0017, -0.0303, -0.0319]],
[[ 0.0193, 0.0016, 0.0305],
[ 0.0265, -0.0370, -0.0125],
[ 0.0213, -0.0056, -0.0170]],
[[-0.0191, -0.0034, -0.0173],
[ 0.0442, 0.1396, -0.0159],
[-0.0545, 0.0043, 0.0072]]],
[[[-0.0284, -0.0141, 0.0013],
[ 0.0023, 0.0118, 0.0363],
[-0.0069, 0.0177, 0.0084]],
[[-0.0248, 0.0134, 0.0264],
[-0.0492, 0.0113, -0.0060],
[-0.0391, -0.0324, -0.0266]],
[[-0.0010, -0.0022, -0.0022],
[ 0.0041, 0.0137, 0.0023],
[ 0.0044, 0.0148, 0.0160]],
...,
[[-0.0215, 0.0181, 0.0368],
[ 0.0143, 0.0232, 0.0365],
[-0.0116, -0.0086, -0.0050]],
[[ 0.0445, 0.0202, -0.0302],
[ 0.0585, 0.0394, -0.0074],
[ 0.0442, 0.0171, -0.0291]],
[[-0.0355, 0.0111, -0.0028],
[-0.0198, 0.0184, -0.0035],
[-0.0190, 0.0018, -0.0114]]]])), ('features.10.bias', tensor([-5.7446e-02, 2.4570e-02, 9.0959e-02, -3.2109e-01, 1.1870e-02,
3.3711e-02, 2.3067e-02, -4.8484e-03, 1.2104e-01, 1.1503e-02,
2.0618e-03, 2.3955e-02, 4.0401e-02, 1.1837e-01, -7.5877e-02,
5.7691e-02, 1.5278e-01, 8.6564e-02, 1.5841e-01, -7.3258e-02,
-1.3988e-02, 5.0015e-02, -1.0287e-01, -2.7890e-02, 6.8270e-02,
-4.2715e-02, -1.4347e-02, 4.1650e-02, -1.1144e-01, -2.2445e-01,
-1.1803e-02, 6.3860e-02, -7.7212e-02, -5.5002e-03, -6.3036e-02,
9.1262e-03, 1.2340e-01, -1.4720e-01, -4.7101e-02, 8.2743e-02,
2.5719e-02, -5.2891e-02, 8.8058e-02, 9.2322e-02, -1.1663e-01,
-6.5487e-02, 1.0130e-01, 5.9148e-02, 9.1565e-02, 1.2468e-01,
-1.2594e-01, -6.2502e-02, 1.9690e-01, -1.8167e-02, 6.7858e-03,
1.1838e-02, -6.5850e-02, -7.8022e-03, 2.5793e-01, -1.1481e-01,
1.2569e-01, 7.2781e-02, 3.7666e-02, -2.0776e-01, 2.1917e-01,
5.9751e-03, -1.5813e-02, 1.7397e-01, -7.8578e-03, 1.5702e-01,
-1.4385e-02, 8.7721e-02, -5.6438e-02, -1.1731e-01, -2.3675e-02,
1.3250e-01, -3.3851e-02, -5.0050e-02, -9.7596e-02, -6.2945e-02,
1.6190e-01, -4.4989e-02, -8.6900e-02, -8.6632e-02, 1.4617e-01,
-4.1667e-02, -1.1800e-01, 9.0233e-03, 4.0534e-02, 8.0767e-03,
2.6678e-02, 2.2335e-02, 2.9483e-02, -1.5715e-02, -8.0130e-02,
7.5983e-02, -4.1338e-02, 2.8686e-02, 1.3600e-01, 2.3705e-01,
4.7630e-02, 5.7637e-02, -6.2161e-02, 2.0026e-02, 3.2948e-02,
6.6223e-02, 3.8356e-02, 3.7612e-04, -6.9782e-02, -6.2359e-02,
4.4190e-02, 1.7126e-01, -6.5330e-02, 2.3059e-01, 1.6632e-01,
-2.1705e-02, -8.7911e-03, -9.6434e-02, 1.2268e-02, -7.4404e-02,
1.0820e-01, 2.9947e-01, -3.0128e-02, -5.9969e-02, 1.2877e-01,
-3.0144e-02, 1.1260e-01, 1.0467e-02, -9.0089e-03, 1.5779e-01,
4.4415e-02, 1.2085e-01, 1.5854e-01, -1.1129e-01, 7.4208e-02,
-4.2408e-02, 2.4565e-02, -1.7885e-01, -7.2289e-02, 8.3184e-02,
-1.8653e-02, 2.4862e-02, 6.2039e-02, 1.1573e-01, 1.9001e-01,
3.0708e-02, -6.5719e-02, -9.2181e-02, -2.1442e-01, -9.2424e-02,
6.3944e-02, 3.6183e-02, 7.3988e-02, 1.1410e-01, 6.0153e-02,
-1.1364e-01, 2.2648e-01, 5.9450e-02, 1.7565e-01, 8.8490e-02,
8.7135e-03, 4.7490e-02, -5.7145e-02, 1.0770e-01, 1.8889e-01,
-1.8853e-02, -9.6265e-02, -1.1147e-02, -6.7515e-03, -4.7700e-02,
-6.3373e-02, -1.8629e-02, -1.2740e-02, 1.2157e-02, -1.1886e-01,
-3.6913e-02, -2.7622e-01, 8.9589e-03, 8.2538e-02, -1.8309e-02,
1.0119e-01, 6.0899e-03, 2.3091e-01, 1.0413e-01, 4.1931e-01,
8.9081e-02, 3.9540e-02, 1.8223e-01, 3.3824e-02, 4.4152e-02,
1.3374e-01, -2.9602e-02, 1.4168e-01, 1.4195e-02, -3.1792e-02,
-1.5169e-01, 1.5414e-01, 1.6415e-01, 5.5740e-03, 5.9057e-02,
-2.4470e-02, 2.3066e-01, 2.4892e-02, 1.0200e-01, 6.4039e-02,
1.5329e-02, 4.4151e-02, 2.9930e-02, 2.7873e-01, -5.1942e-02,
4.6895e-02, -4.5190e-02, -3.8270e-02, -5.8192e-02, -1.1013e-01,
-4.8545e-02, -3.7923e-02, 6.5177e-02, -1.2972e-01, -7.8541e-02,
2.2149e-01, -3.9742e-02, -9.1505e-02, -4.1990e-03, -2.7244e-03,
1.4357e-01, 1.4882e-01, -4.2019e-02, -1.1832e-03, 6.4206e-02,
1.4984e-01, 9.5846e-02, 4.1858e-03, -9.0044e-02, -9.8628e-04,
1.4918e-01, 3.0143e-02, 8.0723e-02, 3.5842e-01, 1.9044e-03,
3.0532e-02, 2.7621e-02, 1.6963e-01, -1.3716e-02, 1.4279e-01,
4.7688e-02, 9.5105e-02, 2.9516e-02, 6.2220e-02, 9.7986e-02,
-2.6807e-01, 6.1009e-03, 1.0210e-01, -5.5891e-03, -2.0582e-01,
2.8513e-01])), ('features.12.weight', tensor([[[[ 2.2190e-02, 9.3375e-03, -2.3806e-03],
[ 7.9333e-04, 3.1214e-03, -4.1305e-04],
[ 1.2009e-02, -6.1337e-03, 1.7373e-02]],
[[-1.7068e-03, 8.7343e-03, -1.9785e-02],
[-8.7037e-04, 1.2904e-02, -1.5547e-02],
[ 5.6292e-03, 3.7641e-03, -8.1919e-03]],
[[ 1.0456e-02, 1.5558e-02, -1.6826e-02],
[-2.6303e-02, 2.4462e-03, -2.7086e-03],
[-3.6940e-02, -9.0857e-03, 9.6455e-03]],
...,
[[ 1.3195e-02, 8.9371e-03, -1.9751e-02],
[ 4.6812e-03, 3.4504e-03, -2.7716e-03],
[ 2.9848e-03, -1.0504e-02, -1.0525e-02]],
[[ 1.6012e-02, -3.9771e-03, 6.3396e-05],
[ 2.3741e-02, 9.2671e-03, 1.5559e-02],
[ 1.3264e-02, 1.7373e-03, 1.2447e-02]],
[[-2.5741e-03, -1.1160e-03, -1.9425e-03],
[-6.2923e-03, -1.5996e-02, -1.3790e-02],
[ 3.4548e-03, -8.0707e-03, -1.9676e-02]]],
[[[-1.7457e-02, -6.2276e-03, 1.0153e-02],
[-1.5329e-02, 6.4883e-03, 2.2166e-02],
[ 9.0666e-03, 4.1839e-03, 1.8104e-03]],
[[ 1.1901e-02, -7.3339e-03, -4.9155e-04],
[ 1.3219e-02, -8.1386e-03, -3.6626e-03],
[ 2.6162e-02, -3.3781e-03, 1.0180e-03]],
[[ 1.1008e-03, -5.7245e-03, 2.8854e-02],
[ 2.6064e-02, 2.1475e-02, 1.5367e-02],
[ 2.8949e-02, 2.0124e-02, 3.1002e-03]],
...,
[[-2.3294e-02, 5.1476e-03, -6.7791e-03],
[-9.0801e-03, -1.9397e-02, -1.3953e-02],
[-1.7359e-02, -5.8093e-03, 7.8484e-03]],
[[-4.4599e-02, 1.0302e-02, -1.4375e-02],
[-3.5310e-02, 1.7527e-02, -2.6089e-02],
[ 1.2913e-03, 3.1758e-02, -1.4552e-03]],
[[-4.6345e-03, 2.5216e-03, -2.3105e-02],
[ 2.1097e-02, 1.2624e-02, -9.0503e-04],
[ 2.7682e-02, 8.1938e-03, -1.5858e-02]]],
[[[-1.7498e-02, 3.0911e-03, 4.0389e-03],
[-2.2586e-02, -7.0206e-03, -2.8252e-03],
[-3.0824e-03, -7.1408e-03, -1.3011e-02]],
[[ 7.0758e-03, 4.7984e-03, -7.1330e-03],
[ 3.3436e-02, 2.8854e-02, 3.2068e-03],
[ 1.7009e-02, 2.2667e-02, 2.2242e-02]],
[[-2.6273e-02, 8.4086e-03, 2.5916e-02],
[-2.7218e-02, -2.5646e-02, 4.8167e-03],
[ 2.1106e-03, -2.8939e-02, -1.3618e-03]],
...,
[[ 3.8196e-02, 6.6969e-03, -3.3032e-02],
[ 3.9109e-02, 7.0522e-02, 6.5660e-02],
[ 2.3385e-02, 8.4650e-02, 6.0362e-02]],
[[ 1.6442e-03, 6.8669e-02, 6.3953e-03],
[-5.9532e-02, -4.7040e-03, -2.2699e-02],
[-4.7889e-02, -1.2805e-04, 1.4766e-03]],
[[ 5.7099e-03, 1.2850e-02, -1.4737e-02],
[ 1.0043e-02, 1.4514e-02, -7.4504e-03],
[-7.8473e-03, 1.7984e-02, 7.1292e-03]]],
...,
[[[-7.2858e-03, 1.5836e-02, -8.9018e-03],
[-2.6001e-02, -2.8088e-02, -3.8419e-02],
[-6.1428e-03, 1.1000e-02, 1.0940e-02]],
[[ 3.3946e-02, 2.2524e-02, 3.1289e-02],
[-8.1132e-03, -1.8785e-02, 4.0857e-03],
[-1.0905e-02, -2.9373e-02, -8.3019e-04]],
[[ 2.0543e-02, 3.4575e-03, 3.5647e-02],
[ 3.3855e-02, 1.7190e-02, 8.5639e-03],
[ 2.2449e-02, 4.3584e-02, 2.7137e-03]],
...,
[[ 1.7171e-02, 2.5890e-02, 3.3639e-02],
[ 1.3982e-02, 8.4238e-03, 8.0006e-03],
[ 2.1766e-02, 1.4913e-02, 2.8352e-02]],
[[-1.0011e-02, 1.5742e-02, 8.0610e-04],
[-3.9899e-02, 5.9374e-03, -3.5478e-02],
[-3.8416e-02, -1.6607e-02, -1.2068e-02]],
[[ 3.4733e-02, 3.8215e-02, 3.9510e-04],
[ 1.8873e-02, 1.4733e-02, 6.0042e-04],
[ 9.3505e-03, 2.1432e-02, 8.0231e-03]]],
[[[-3.4877e-02, -8.7256e-03, 2.0653e-02],
[-3.4053e-02, -4.3450e-03, 4.5571e-02],
[-1.3813e-02, -8.1692e-03, 1.7117e-02]],
[[-9.0386e-04, 1.0830e-02, -4.2244e-03],
[-1.5450e-02, -2.0254e-03, -2.2527e-02],
[-3.9511e-02, -2.6893e-03, -1.1160e-02]],
[[ 1.5799e-02, -1.1858e-02, 1.0223e-02],
[ 1.5443e-02, 1.5404e-02, 2.6426e-02],
[ 1.9375e-02, -3.5508e-03, 3.5140e-03]],
...,
[[ 1.6287e-02, -8.6022e-03, -1.3193e-02],
[ 5.4952e-04, -5.2909e-03, 5.9362e-03],
[-6.0901e-03, 2.1227e-02, -2.0793e-03]],
[[-3.2922e-02, -3.4984e-02, 3.4344e-02],
[-2.3168e-02, -3.0376e-02, 3.8597e-02],
[ 3.5357e-02, -2.1286e-02, -1.9970e-02]],
[[ 2.1006e-02, -1.4642e-02, -3.2151e-02],
[ 2.8518e-02, 9.7746e-04, -3.4977e-02],
[ 3.8455e-02, 9.7505e-03, -2.4813e-02]]],
[[[ 9.4910e-03, 1.9250e-03, -1.9075e-02],
[ 3.5648e-02, -2.0767e-02, -2.1764e-02],
[-1.1669e-03, 2.0074e-03, -1.7276e-03]],
[[-9.8238e-03, 6.2034e-04, 1.5453e-02],
[-1.9473e-02, -2.1708e-02, 2.2915e-02],
[ 2.2752e-02, 3.5843e-03, 6.9153e-03]],
[[-1.2752e-02, 4.1273e-03, 1.3468e-02],
[-5.2075e-02, -6.9341e-03, -9.7277e-03],
[-3.2709e-02, 2.1979e-03, 1.6788e-02]],
...,
[[-1.0954e-02, -2.6567e-02, 9.4364e-03],
[ 3.0442e-02, 1.4641e-03, 4.1114e-03],
[ 9.9419e-03, 9.3903e-03, -1.3467e-02]],
[[ 2.7166e-02, -2.1132e-02, -3.4153e-02],
[ 7.3596e-02, 2.6811e-04, -3.5622e-02],
[-5.1473e-03, 2.2438e-02, -1.8705e-03]],
[[ 2.1269e-02, 3.2318e-02, 1.4691e-02],
[ 4.7579e-02, 4.3891e-02, 4.2584e-03],
[ 3.4317e-02, 1.6485e-02, 1.5579e-02]]]])), ('features.12.bias', tensor([-1.6012e-01, -5.0735e-02, -3.7225e-02, 1.5612e-01, 6.2651e-02,
9.1161e-02, -9.1950e-02, -6.1958e-02, -1.0152e-01, 2.7419e-02,
8.4163e-02, 4.0658e-02, 2.2090e-01, 3.6786e-01, -6.0822e-02,
4.7705e-01, 2.1508e-02, -3.6398e-02, 1.5033e-01, -4.0052e-02,
1.4129e-01, 1.0264e-01, 6.4401e-02, 3.6530e-02, -7.9153e-02,
1.1684e-01, 1.7219e-01, 5.9786e-02, 5.5542e-02, -1.4757e-01,
2.0579e-02, 7.8264e-02, -2.0459e-02, 8.7215e-02, 1.5873e-01,
2.0761e-02, 1.1587e-01, 6.8712e-02, 1.0054e-01, 4.6842e-02,
5.3803e-02, -1.7300e-01, 1.0532e-01, -9.2736e-03, -2.9297e-03,
-2.2108e-01, 2.3624e-01, -1.5397e-01, 6.4517e-02, 2.6478e-02,
-1.2072e-01, 1.8867e-02, 6.4691e-03, 1.3244e-02, 4.5646e-02,
2.7232e-01, -3.7951e-02, 1.2576e-02, 4.0468e-02, -3.6800e-03,
-9.4446e-02, 3.1483e-02, -1.2433e-03, 9.4844e-02, -1.4001e-02,
-5.7131e-02, 1.2919e-01, 1.3817e-01, 4.7747e-02, 3.9263e-02,
-1.8301e-02, 1.1912e-01, 1.5127e-01, -1.0465e-02, 2.1670e-01,
-1.1868e-02, -7.4624e-02, -2.7190e-02, 2.4109e-01, -7.6152e-02,
-2.4461e-02, 8.5809e-02, 2.6740e-01, 1.3322e-02, -4.7746e-02,
3.5034e-02, 6.4532e-02, -2.1023e-02, 1.5194e-02, 1.0387e-01,
-5.0722e-02, 8.3656e-03, -1.0415e-01, 1.4376e-02, 3.6262e-02,
8.5766e-03, 4.4741e-02, 4.5479e-02, 2.9686e-01, -1.6799e-01,
5.0258e-02, 2.1579e-01, -4.2610e-03, 2.7685e-01, -4.6296e-02,
-5.0471e-02, 1.1650e-01, -1.1389e-01, -1.6555e-01, 9.5928e-02,
1.2904e-02, 1.5499e-01, -1.4816e-03, -4.5260e-02, 7.0139e-02,
1.8186e-01, -1.2745e-01, 1.6226e-02, 2.3446e-03, -5.0818e-02,
-4.9849e-02, 8.2467e-02, -3.1306e-02, -4.6938e-02, 1.3635e-01,
-3.4978e-02, 1.2757e-01, -7.2327e-02, 4.0631e-02, 8.8918e-02,
-2.0051e-02, 1.1981e-01, -3.5570e-02, 6.6117e-02, 2.9789e-01,
-5.5040e-02, 1.3873e-01, 7.6603e-02, 7.6449e-02, -1.6732e-03,
7.8121e-02, 1.5447e-02, 2.5804e-02, -1.9870e-01, -2.6819e-02,
5.6701e-02, -2.1692e-02, -2.2948e-02, 9.4408e-02, 1.5145e-01,
-1.8660e-01, 5.6927e-02, 6.0845e-02, -1.0473e-01, 4.6626e-02,
1.8027e-02, 2.0851e-02, 8.6570e-02, 8.4838e-02, 7.3159e-02,
-4.2122e-02, 1.9796e-01, 1.3179e-01, 2.0714e-01, 6.6279e-02,
4.4824e-02, -1.4769e-01, -5.9476e-02, -5.4382e-02, 2.1701e-02,
2.3693e-01, 3.8262e-02, 1.1462e-01, 3.5898e-02, 8.4504e-02,
-3.6447e-01, -3.9428e-02, 1.2056e-01, 2.4456e-02, 2.5343e-03,
8.7984e-02, -2.0897e-02, 7.6671e-03, -1.2385e-02, -2.7026e-02,
-7.1514e-02, 1.2748e-01, -6.3203e-02, -1.0576e-01, 1.2725e-01,
2.4837e-02, -5.7470e-02, 7.4145e-02, 2.5202e-02, 4.0945e-02,
3.4231e-02, 6.6116e-02, -2.0097e-01, -5.1009e-02, 1.3706e-01,
-4.8891e-02, 2.7549e-02, 3.6732e-02, 6.8057e-02, -1.3434e-01,
-8.9797e-03, -1.5936e-02, 2.0673e-01, -1.4660e-01, -3.5366e-02,
2.2485e-03, 1.3222e-01, 5.1675e-02, -2.9735e-02, 1.6665e-03,
2.6343e-02, 6.8851e-02, 3.4044e-02, -9.8308e-03, 7.6839e-02,
-1.4244e-01, 1.4513e-01, 1.8962e-01, -5.2993e-02, -7.1651e-02,
3.8192e-02, 2.1461e-01, 1.3760e-01, 1.2501e-01, -1.1683e-01,
8.7743e-02, 7.0001e-02, 2.9863e-01, -6.2018e-02, 3.1917e-02,
1.1081e-01, 3.1610e-02, -1.8750e-01, -4.6551e-02, -4.4282e-02,
-4.7255e-02, 1.5423e-02, -2.8847e-02, 5.8926e-02, 1.2185e-01,
-7.5157e-02, 6.0321e-02, 5.7117e-02, 1.3085e-04, -6.9822e-02,
1.6722e-01, -6.3790e-02, 7.7138e-02, 7.0001e-02, -9.7195e-02,
1.0346e-01])), ('features.14.weight', tensor([[[[-2.2149e-02, -8.2414e-03, 9.6534e-03],
[-1.3779e-02, -6.8542e-03, 7.5707e-03],
[-2.4095e-03, 2.4156e-02, 1.3655e-02]],
[[-1.6860e-02, -1.0794e-02, 1.7457e-02],
[-2.1419e-02, -1.2860e-02, 3.4045e-02],
[-2.6813e-02, -3.2066e-02, 6.4252e-03]],
[[ 2.4645e-02, 6.3990e-02, 3.2102e-02],
[-1.8808e-02, 4.6972e-02, 2.0714e-02],
[-3.9489e-02, -5.4023e-02, 5.1483e-03]],
...,
[[-2.4233e-02, -3.5629e-02, -1.6502e-02],
[-9.8831e-03, -2.8307e-02, -3.1861e-02],
[-2.0740e-02, -1.8873e-02, -3.5950e-02]],
[[-1.4095e-02, -2.7954e-02, -1.6467e-02],
[ 3.5940e-03, -4.0346e-02, -6.5416e-03],
[ 5.5689e-03, -6.9456e-03, 2.2095e-02]],
[[ 1.6124e-02, -2.5622e-04, -2.0933e-02],
[-1.2210e-02, 5.8796e-03, -2.6578e-02],
[ 2.4079e-02, 3.7444e-02, -6.3200e-04]]],
[[[ 1.8710e-02, 2.8985e-02, 1.1611e-02],
[ 9.5369e-03, 5.7986e-03, -4.6419e-03],
[-3.5390e-03, -3.2317e-03, -2.5634e-02]],
[[-2.4254e-03, -6.8106e-03, 1.8674e-02],
[-1.5460e-02, -1.5269e-02, -3.3920e-03],
[-7.7486e-03, -1.1479e-02, 2.3092e-03]],
[[ 2.5946e-02, 1.7179e-02, -5.0896e-03],
[ 2.9008e-02, 4.2745e-02, -4.6755e-03],
[ 1.9021e-02, 4.8730e-02, 1.6705e-02]],
...,
[[-9.2972e-03, -5.6574e-04, -1.8657e-02],
[-1.2496e-03, 1.5984e-02, 2.1087e-02],
[-1.2462e-02, -9.7697e-03, -3.3838e-03]],
[[ 1.1037e-02, -6.6580e-03, -1.6569e-02],
[-3.4399e-02, -3.8640e-02, -1.6853e-02],
[-2.8134e-02, -4.0835e-02, -2.1695e-02]],
[[-2.2043e-02, -2.2717e-02, 1.2967e-03],
[-1.0877e-02, -2.1517e-02, 3.7078e-03],
[-1.2117e-02, -1.8376e-02, -2.6302e-02]]],
[[[-4.2283e-03, 2.7115e-02, -5.0168e-03],
[ 6.9854e-03, 2.4553e-02, -1.6055e-02],
[ 1.8330e-02, 1.4279e-02, 9.6839e-03]],
[[-1.1661e-03, -6.3398e-03, -2.2814e-02],
[-5.3035e-04, 1.3962e-02, 2.1199e-02],
[-1.6242e-02, 1.0420e-02, -9.6910e-03]],
[[ 3.0347e-02, 4.4854e-03, 1.1800e-02],
[ 2.5272e-03, -3.2939e-02, -1.0493e-02],
[-3.3863e-02, -1.0100e-02, -8.8865e-04]],
...,
[[-2.3229e-02, -2.6648e-02, 6.1200e-03],
[-2.9659e-02, -1.9713e-02, 4.8856e-02],
[ 1.4367e-02, 3.0869e-02, 3.0228e-02]],
[[-6.2127e-02, -6.8233e-02, -2.4166e-02],
[ 2.1200e-02, 1.2648e-01, -1.2569e-02],
[ 6.1253e-03, 5.8688e-02, -4.4977e-02]],
[[-9.6391e-03, 5.9208e-03, -4.9147e-02],
[-3.1329e-03, 3.0318e-03, -2.0502e-03],
[-9.0497e-03, 5.4860e-03, 4.1482e-03]]],
...,
[[[-3.0342e-02, -3.6778e-02, -1.7676e-02],
[-1.3094e-02, -2.2813e-02, -7.9857e-05],
[ 4.2266e-03, -1.9588e-03, 5.2140e-03]],
[[-1.9099e-02, -1.4363e-02, -2.7928e-02],
[-8.7775e-03, -1.3442e-02, -2.1776e-02],
[-3.5407e-02, -2.4818e-02, -1.2660e-02]],
[[ 1.2218e-02, 4.1714e-03, 1.0525e-02],
[-6.5192e-03, -4.0068e-04, 4.3611e-02],
[-2.5296e-02, -1.8979e-02, 2.7715e-02]],
...,
[[ 3.7842e-02, 3.9178e-02, 1.0313e-02],
[-9.9228e-03, -1.0001e-02, -3.7412e-02],
[-6.3894e-03, -2.4837e-02, -3.9814e-02]],
[[-5.2920e-03, -1.6280e-03, 6.7065e-03],
[ 1.4767e-02, 1.8472e-03, -4.6330e-03],
[ 3.9676e-03, -3.8436e-03, 1.8462e-05]],
[[ 2.0358e-02, 2.2053e-02, -1.3662e-02],
[-1.2793e-02, 6.0091e-03, -1.8806e-02],
[-4.3575e-02, -2.6307e-02, 2.1265e-02]]],
[[[-3.0884e-03, 2.0031e-03, -1.6961e-03],
[-1.2236e-02, 7.7159e-04, -4.9329e-03],
[-1.4896e-02, -3.0245e-04, 1.0504e-02]],
[[-1.2694e-02, -1.3070e-02, 1.2268e-02],
[-1.2225e-02, -1.4629e-02, 3.7937e-03],
[-1.8198e-03, -8.7697e-03, 1.8177e-02]],
[[ 4.5653e-03, -4.4324e-02, -3.7059e-02],
[ 2.4033e-02, 5.9055e-02, -3.1115e-02],
[ 1.8395e-02, 6.7375e-02, -1.4521e-02]],
...,
[[-6.6661e-03, 1.2244e-02, 1.0125e-02],
[-1.7618e-02, -2.1308e-02, -5.1447e-03],
[-1.8775e-02, -1.8974e-02, -2.6915e-02]],
[[-4.2123e-03, 1.1425e-02, -2.1308e-03],
[ 5.0488e-03, 1.6919e-02, 3.2309e-02],
[ 2.2250e-02, 2.3884e-02, 7.3293e-03]],
[[ 2.5346e-02, -2.4656e-03, 2.1781e-02],
[ 8.1773e-03, -2.5338e-02, -6.2149e-03],
[-9.1320e-03, -7.8748e-03, 6.5382e-03]]],
[[[ 5.7087e-03, -6.2018e-03, 5.5422e-05],
[-3.6819e-03, 8.8544e-03, -4.6847e-03],
[-2.7346e-03, -2.1267e-03, 4.9169e-03]],
[[-7.9532e-03, -1.1012e-02, -1.7147e-02],
[-1.3081e-02, 4.0728e-04, -5.2604e-04],
[-6.0302e-03, 1.8281e-02, 2.0273e-02]],
[[-2.1117e-03, -3.7656e-02, -2.3536e-02],
[ 4.2944e-03, -2.6739e-02, -2.5058e-02],
[ 3.6964e-03, -1.2538e-02, -3.5994e-02]],
...,
[[-1.6156e-02, -7.3295e-03, -2.2457e-02],
[-9.6228e-03, -1.9578e-02, -2.0918e-02],
[-1.6559e-02, -8.8631e-03, -6.2591e-03]],
[[-1.8987e-02, -2.1317e-02, -4.0958e-02],
[-1.7522e-02, -3.6686e-02, -3.4363e-02],
[-2.8595e-02, -3.5819e-02, -1.8181e-02]],
[[ 1.1593e-02, 1.0623e-02, -1.3901e-02],
[ 2.2684e-02, 2.2538e-02, 1.7627e-02],
[ 9.7939e-03, 1.5285e-02, 3.0524e-02]]]])), ('features.14.bias', tensor([ 2.3224e-01, -2.4138e-04, 6.4960e-02, -4.2017e-01, 6.0453e-02,
9.8442e-02, 4.6449e-02, -7.3136e-03, -1.8736e-02, -1.9464e-02,
7.8066e-02, 8.2655e-02, 1.2813e-01, 8.7396e-02, 1.7907e-01,
1.3942e-01, 2.0888e-02, -8.0227e-03, -4.0741e-02, 1.1292e-02,
2.2233e-02, 2.7878e-02, -2.5189e-01, -3.2148e-01, 1.1135e-01,
-5.7587e-03, -8.3068e-03, -6.6311e-02, 1.1230e-01, 1.1002e-01,
2.5989e-02, 4.4331e-02, -9.7547e-02, 1.0856e-01, 3.5750e-03,
5.6862e-02, 9.4468e-02, 1.4510e-02, -1.3779e-01, 8.2087e-02,
7.0894e-02, 1.3279e-01, 3.9849e-02, 1.6688e-01, 1.8628e-01,
5.3910e-02, 8.5733e-02, -1.6489e-02, 1.0809e-02, 4.9333e-02,
1.8078e-01, -4.7200e-02, 5.0713e-02, -7.0014e-02, -2.5531e-02,
-1.1994e-01, -5.9129e-02, 3.7465e-02, 4.3772e-02, 5.7414e-04,
6.9726e-02, 1.9478e-01, 2.8760e-03, -2.2992e-02, -7.7390e-02,
-5.2995e-02, 2.2579e-02, 3.6237e-02, -6.6670e-02, 7.1317e-02,
9.1155e-02, 2.9420e-02, 1.5036e-02, 8.8521e-02, -1.2923e-02,
2.4080e-01, -8.0947e-02, -3.8936e-02, 8.8143e-02, 3.5808e-02,
1.5534e-01, -4.5179e-02, 1.8793e-03, 3.6875e-02, -3.6595e-02,
2.0280e-02, -5.3748e-02, -8.6558e-02, 1.3941e-01, 3.0854e-02,
2.9668e-02, -2.4416e-02, 2.6000e-03, -7.3550e-02, 1.4768e-01,
-2.1652e-01, 5.3047e-02, 1.1851e-01, -4.8188e-02, 1.4722e-02,
-9.4306e-03, -5.0296e-02, 1.4726e-01, 7.1685e-02, 4.4254e-02,
6.7750e-02, 1.1392e-02, -6.5454e-02, 8.9793e-02, -6.5129e-03,
-8.9897e-02, 4.6557e-02, 3.1321e-02, 1.7463e-01, 9.7366e-02,
6.6568e-02, 2.3548e-02, 2.6455e-01, 8.2348e-02, 1.0178e-01,
-6.8253e-02, -1.8821e-02, 1.4865e-01, 2.3880e-01, -8.9092e-02,
1.4677e-01, 4.6342e-02, 1.3640e-01, 2.3311e-01, -5.7933e-01,
-5.8695e-02, 1.5480e-01, 9.7374e-02, 7.1892e-02, 2.8998e-02,
6.9282e-02, 5.9374e-02, 9.2056e-02, -2.1834e-02, 1.1740e-01,
-2.0650e-02, -7.8520e-02, 2.0102e-03, -2.0359e-02, 6.5492e-02,
2.3471e-01, 8.3693e-02, 1.0269e-01, 2.9251e-02, -4.6842e-03,
1.9002e-01, 1.2875e-02, 2.1704e-01, 1.1589e-01, -3.2832e-03,
-4.5286e-02, -1.3415e-02, -3.4149e-02, 2.7222e-01, -1.7798e-02,
-3.9128e-02, 7.0385e-02, -7.2472e-03, 3.2494e-01, 1.4990e-01,
1.7056e-01, 7.0425e-02, -1.9899e-02, 7.3611e-02, -2.3791e-02,
5.1146e-02, 1.2544e-02, 2.8663e-01, 1.9698e-01, -6.8813e-02,
1.1687e-01, -7.8582e-02, -6.9537e-02, 1.6115e-03, 9.7725e-02,
6.8310e-02, 7.5665e-02, 1.3851e-01, 7.0042e-02, 1.8399e-01,
2.2156e-01, 3.1850e-02, -2.7991e-03, 1.2451e-01, 7.0588e-02,
3.0229e-02, 6.0480e-02, 1.1667e-01, 1.5710e-01, 1.1263e-02,
-2.9042e-01, 9.3900e-03, 6.0581e-03, 8.3776e-02, 7.1657e-03,
6.3825e-02, -5.3517e-02, -1.0420e-01, 1.8267e-01, 2.5337e-02,
-5.7360e-04, -8.6966e-02, 4.4366e-02, 1.9753e-02, -8.9814e-02,
9.9245e-02, 1.0041e-01, 1.1926e-01, 2.2637e-01, -4.1234e-03,
-9.3399e-02, 4.1874e-02, 1.7009e-01, 8.4081e-02, -2.9981e-02,
1.3411e-01, 7.9522e-02, 1.1152e-01, -1.9762e-02, 2.3515e-01,
-5.3482e-03, 7.8955e-02, 2.0268e-01, 7.5398e-02, 1.2858e-01,
3.6243e-02, -2.2149e-01, -3.2433e-03, 1.2848e-01, 5.3996e-02,
1.4744e-01, 1.6092e-01, 1.0675e-01, 8.5267e-02, 1.5651e-01,
6.2202e-02, 1.8605e-02, -1.1282e-01, 1.1531e-01, 4.3558e-02,
6.8836e-02, 1.0267e-01, 2.3824e-02, -1.1741e-02, 1.6236e-01,
1.8433e-01, 1.4847e-01, -4.3660e-02, 1.3364e-01, -4.5390e-02,
2.5698e-02])), ('features.16.weight', tensor([[[[-2.0054e-02, 4.5959e-02, 2.4285e-03],
[-1.0683e-02, 5.2972e-03, 3.7511e-03],
[-2.4106e-03, -1.2168e-02, 1.0104e-02]],
[[-1.2632e-02, -2.5883e-02, -8.5592e-03],
[-1.2386e-02, -5.5469e-02, -4.8311e-02],
[-1.6625e-02, -3.7525e-02, -2.6493e-02]],
[[ 3.8509e-02, 3.7259e-02, -1.4662e-02],
[ 5.7820e-02, 3.1240e-02, -5.4511e-03],
[-1.6632e-02, -6.7681e-03, 1.3355e-02]],
...,
[[-8.8624e-03, 2.4756e-02, 3.0510e-02],
[-2.1363e-02, 2.0381e-02, 3.5623e-02],
[-1.8509e-03, 3.5199e-03, -1.2715e-02]],
[[ 3.4045e-03, -1.9371e-02, -1.0391e-02],
[ 3.0154e-03, -2.5516e-02, -2.0094e-02],
[ 1.4827e-02, 2.7095e-03, 6.5585e-03]],
[[-1.2619e-02, 6.1968e-04, 9.2541e-03],
[-1.9611e-02, 1.9820e-02, 3.8648e-03],
[-1.2689e-02, 6.4709e-03, -7.3962e-03]]],
[[[ 1.3729e-02, 1.4296e-02, -6.1864e-03],
[ 5.7469e-02, 1.7736e-02, 1.3869e-02],
[ 3.0767e-02, 1.9137e-02, -6.7595e-03]],
[[-1.6647e-02, -4.1772e-02, 4.1301e-03],
[-1.0602e-02, -2.6179e-02, 1.1898e-02],
[-8.0377e-03, -2.4341e-02, 3.9841e-03]],
[[ 3.3982e-03, 2.2854e-02, 6.2299e-03],
[-5.5849e-03, -4.2709e-03, -9.4620e-03],
[ 8.8606e-03, -1.2502e-02, -2.0650e-02]],
...,
[[-3.3839e-03, 2.0091e-03, -3.5215e-03],
[-1.6417e-02, -4.3874e-03, -1.2504e-02],
[-1.4373e-02, -3.7632e-03, -2.2013e-02]],
[[ 1.6351e-02, 4.8943e-02, 1.6001e-02],
[ 1.5020e-02, 3.7409e-02, 1.4995e-02],
[ 2.1674e-02, 1.2354e-02, 1.4808e-03]],
[[ 1.1995e-02, 5.3228e-03, 7.3681e-03],
[ 1.1077e-02, 9.1181e-03, 1.7539e-03],
[ 4.4813e-02, 5.0831e-03, 1.8113e-03]]],
[[[-2.0534e-03, 2.2168e-03, -1.7295e-03],
[-1.2071e-02, -1.8105e-02, -1.0962e-02],
[ 4.5020e-02, -3.1528e-02, -2.2790e-02]],
[[-8.2106e-03, -1.3699e-02, -1.2184e-02],
[-7.9822e-03, -1.2707e-02, 8.0432e-03],
[ 4.1536e-03, -9.5290e-03, 1.1782e-02]],
[[-1.7297e-02, -2.0688e-02, 1.3545e-02],
[-3.6407e-03, -1.2636e-02, -1.2658e-02],
[ 1.0540e-02, -4.8738e-03, -1.1150e-02]],
...,
[[-1.4873e-03, 1.5673e-02, 3.9235e-03],
[ 3.7199e-03, 3.2357e-02, 3.2763e-02],
[ 3.8112e-03, 2.7447e-02, 2.4343e-02]],
[[ 6.5385e-03, -1.2451e-02, -9.8216e-03],
[ 4.5049e-03, -2.5117e-02, -2.9441e-02],
[ 1.6662e-02, 5.0969e-03, -3.1925e-02]],
[[-3.0679e-04, 2.1420e-03, 4.5710e-04],
[ 1.1923e-02, 6.3044e-03, 5.3676e-03],
[ 6.3319e-03, 1.5382e-02, 1.3769e-03]]],
...,
[[[-1.0196e-02, -1.5307e-02, -6.5358e-02],
[ 6.4789e-03, -6.2395e-03, -5.9611e-02],
[ 9.7203e-03, 3.6012e-02, -1.8961e-02]],
[[ 1.4425e-02, 9.5859e-03, -3.2877e-02],
[ 8.3048e-03, -2.5427e-02, -1.8519e-02],
[ 2.1592e-02, -3.1290e-02, -1.0763e-02]],
[[ 3.5889e-04, -7.7981e-03, -1.2154e-02],
[-1.5725e-02, -1.4608e-02, -1.4470e-02],
[-8.8080e-03, 2.3435e-02, 2.6203e-02]],
...,
[[ 1.1523e-02, -1.4577e-02, 1.6442e-03],
[-1.5564e-02, -1.8016e-02, -3.3473e-03],
[-8.3181e-03, -2.0492e-02, -3.5564e-06]],
[[-4.1408e-03, 5.0721e-03, -4.4403e-02],
[ 3.9026e-03, -1.0039e-02, -5.6778e-02],
[ 1.9235e-02, -4.8915e-02, -4.8355e-02]],
[[-1.2534e-02, -2.9425e-02, -1.7007e-02],
[-9.3814e-03, 1.0269e-03, -6.9089e-03],
[ 6.1900e-03, 3.2867e-03, 2.5767e-02]]],
[[[-1.1315e-02, -3.1747e-02, -4.9764e-02],
[-3.0308e-03, -5.8178e-03, -2.9852e-02],
[ 2.2658e-02, 1.7780e-02, -1.8609e-02]],
[[-8.6915e-03, -6.4885e-03, -1.4613e-03],
[ 9.4756e-03, -2.4436e-02, -1.9849e-02],
[-4.1832e-03, -6.4834e-03, -1.8950e-03]],
[[ 3.0659e-03, 1.5401e-02, 2.3925e-02],
[-8.9226e-03, -1.0473e-03, 1.0635e-02],
[-7.0022e-03, 9.0982e-03, 1.9060e-02]],
...,
[[ 6.3793e-03, 1.1504e-02, -1.8065e-02],
[ 1.1370e-02, -2.9965e-03, -4.6035e-02],
[ 1.0982e-02, -1.2466e-02, -2.5738e-02]],
[[-9.2284e-03, 1.1269e-03, 3.1835e-02],
[-2.1525e-02, -1.2746e-02, 2.2540e-02],
[-2.1287e-02, -1.4105e-02, -4.2326e-03]],
[[-2.8699e-02, -2.1606e-04, 2.1980e-02],
[-1.1902e-02, 6.6218e-03, 1.3280e-02],
[ 1.2504e-04, -5.5446e-03, 1.4436e-02]]],
[[[-7.1501e-03, 5.7026e-03, 4.3995e-03],
[ 8.6375e-03, 9.6421e-03, 6.4732e-03],
[ 3.2578e-02, -4.6799e-03, 2.2420e-02]],
[[-1.0038e-02, -1.6056e-02, -5.6870e-03],
[-1.5084e-02, -2.4560e-02, -1.0203e-02],
[ 8.7223e-03, 5.0839e-04, -1.5827e-02]],
[[-6.0540e-02, -7.9864e-02, -1.4975e-02],
[-2.7965e-02, -3.2907e-02, -5.5244e-04],
[-6.6105e-03, 3.1635e-03, -2.3345e-03]],
...,
[[ 3.3884e-02, 1.1538e-02, -2.1285e-02],
[ 2.5177e-02, 5.6610e-03, -7.5835e-03],
[ 2.9179e-02, 1.6685e-02, -3.9618e-03]],
[[-1.4013e-02, -2.0572e-02, -3.3954e-02],
[ 4.5088e-03, -6.3033e-03, -2.1324e-02],
[ 8.6164e-03, 1.8427e-02, -1.6961e-02]],
[[-3.7631e-02, -4.6700e-02, -4.7414e-02],
[-3.7810e-02, -5.4088e-02, -5.2499e-02],
[ 4.4325e-03, 1.5679e-03, -1.2768e-02]]]])), ('features.16.bias', tensor([ 1.1129e-01, -6.5595e-02, 1.1674e-01, 1.0266e-01, 1.3823e-01,
-1.0093e-01, -3.2487e-02, 2.7276e-02, -1.1670e-01, 4.7968e-02,
-1.3399e-02, -1.5048e-02, 3.0006e-02, 4.6775e-02, 4.5095e-02,
-7.0518e-02, -1.3427e-02, 5.9646e-02, -4.3785e-02, 9.6971e-02,
-5.3902e-02, 1.0772e-01, 2.3518e-02, -8.9939e-02, -3.1992e-03,
7.1295e-02, 5.8106e-02, 3.3375e-01, -4.8308e-02, 3.2969e-02,
-2.4833e-02, -1.9172e-02, 2.7820e-01, -1.1369e-01, 1.2396e-01,
-1.2624e-01, -7.3003e-02, 1.2396e-01, -1.6797e-02, -6.3193e-02,
9.5591e-02, -9.0238e-02, 5.1184e-01, 4.9748e-02, 5.3947e-02,
-5.3706e-02, 1.5368e-03, -6.6550e-03, -5.9959e-02, 4.0225e-02,
-4.1607e-02, 2.4601e-02, 4.3741e-02, -4.0718e-02, -5.7726e-02,
2.4598e-02, -2.2844e-02, -1.2988e-01, 1.6749e-01, 1.8316e-01,
8.0369e-02, -8.1832e-02, -2.2437e-03, -2.9780e-02, 4.8575e-03,
4.8244e-02, -5.5178e-02, -1.5325e-01, 1.2398e-01, 1.3155e-01,
3.3666e-02, -1.1710e-02, -4.5616e-02, -7.1957e-02, 6.0881e-02,
9.5858e-02, -1.1459e-02, -6.3304e-03, 4.7153e-02, -4.4621e-02,
-8.4636e-03, -1.3082e-02, 2.0789e-01, 2.4093e-02, -2.9987e-02,
3.6080e-01, -3.2765e-02, 2.6315e-02, -1.0187e-03, -3.0344e-02,
-7.1845e-02, 5.7727e-02, -4.4341e-02, -6.7018e-03, 2.3211e-02,
1.1662e-01, 1.2081e-01, -9.5667e-02, 7.3824e-02, 2.6407e-02,
-5.2576e-02, 1.7535e-01, 2.0463e-02, -5.3516e-02, 3.9142e-02,
-2.0515e-02, 2.0954e-02, 3.2002e-02, 2.6436e-02, 8.2437e-02,
8.3775e-02, -1.9310e-02, 2.6033e-01, 3.0604e-02, 1.2936e-01,
2.2118e-02, 4.0956e-02, -7.7246e-04, -1.4471e-01, 1.4958e-01,
3.1322e-02, 8.6348e-02, -1.4537e-01, -1.0755e-01, 7.4208e-03,
-7.0875e-02, 8.2423e-02, -5.9071e-02, -7.2380e-02, -7.0351e-02,
1.1574e-01, 1.4701e-01, -2.0563e-03, -5.3370e-02, -4.2828e-02,
-1.1512e-02, -6.3480e-02, -4.6934e-02, 1.7201e-01, 5.4891e-02,
-1.6272e-02, 2.5652e-01, -6.0798e-02, -8.0041e-02, -1.8932e-02,
-1.1083e-01, 2.1932e-02, -1.7786e-02, 1.6212e-02, -3.6363e-02,
-1.4489e-02, 3.9988e-02, 4.8974e-03, 3.2037e-03, 1.6398e-02,
-1.6936e-02, -2.0609e-02, -6.1068e-04, 9.8372e-02, -2.2700e-05,
1.0850e-01, -4.2645e-02, 5.3368e-02, -1.1500e-01, 1.8732e-01,
1.6985e-01, 3.7745e-01, -7.0661e-02, 9.7715e-02, 1.3798e-01,
3.3904e-02, -9.5618e-02, -1.0801e-01, 1.1457e-01, 9.5222e-02,
1.6966e-02, 1.4716e-01, 2.2498e-01, -6.2935e-04, 1.5877e-02,
2.1610e-01, -2.8842e-02, 1.0055e-01, -1.3666e-02, -2.5940e-02,
-5.9442e-03, -8.1836e-02, 1.6198e-01, 2.8364e-03, 9.3723e-02,
3.7836e-02, -2.0277e-02, -4.0936e-02, -2.4426e-02, -1.1784e-02,
-4.1636e-03, 1.1381e-01, 5.5490e-02, 1.1412e-01, -1.6489e-03,
1.1480e-01, 8.7032e-02, -7.5473e-02, 8.8914e-02, 1.6656e-01,
7.7635e-02, 3.7702e-01, 7.6020e-02, -1.9089e-02, -3.4244e-02,
3.2959e-02, 1.3892e-01, -8.1984e-02, 2.4682e-02, -4.5648e-02,
-4.4383e-02, 5.0465e-02, 6.8235e-02, 5.4163e-02, -1.0211e-01,
2.6686e-02, -2.8846e-03, -6.2859e-02, 2.7318e-01, 5.9794e-02,
-7.3154e-02, -6.4157e-02, -1.5369e-02, -7.8196e-02, -4.9677e-02,
-3.1869e-02, 2.7572e-02, 7.9430e-02, 1.0354e-01, 2.6763e-02,
8.0672e-02, 4.9645e-02, -7.7313e-02, -5.8717e-02, -4.2949e-02,
5.2187e-02, 5.7690e-02, 3.5173e-02, 4.9257e-02, -1.0447e-03,
8.1513e-02, 1.5830e-01, 4.3074e-02, 3.4226e-02, -1.2562e-01,
2.3248e-01, 3.4011e-02, 7.0319e-04, 2.3811e-03, -4.4095e-02,
-5.9384e-02])), ('features.19.weight', tensor([[[[ 7.6028e-03, 1.5823e-02, 5.3801e-03],
[ 3.0424e-03, 1.7674e-02, 3.4824e-02],
[ 2.9429e-02, 8.2329e-03, 3.1109e-02]],
[[ 1.8723e-02, 1.4447e-02, -1.9750e-03],
[ 2.0086e-02, -1.5187e-03, 8.5380e-03],
[-3.9245e-03, -9.8634e-04, 9.9558e-03]],
[[-1.4760e-02, -1.2458e-02, 2.2764e-02],
[-1.7024e-02, -3.0757e-02, 7.2522e-03],
[ 8.3806e-03, 9.8483e-03, 1.4754e-02]],
...,
[[-2.5732e-02, -4.7436e-02, 4.9733e-04],
[-1.0372e-03, -5.3227e-03, 1.4351e-02],
[ 2.2109e-02, 6.8368e-03, 1.3325e-02]],
[[-2.3630e-02, -2.6565e-02, -2.1751e-02],
[ 3.3754e-03, -3.7258e-03, -1.5960e-02],
[ 3.3379e-02, 9.6359e-03, -6.6191e-03]],
[[-6.8704e-03, -2.3740e-02, 7.7378e-03],
[-3.4402e-02, -1.6394e-02, -1.1397e-02],
[-8.2355e-03, 1.7833e-04, -1.5460e-02]]],
[[[-1.1351e-02, -1.4132e-02, -1.1851e-02],
[-4.8102e-03, -5.9010e-03, -7.0331e-03],
[ 1.1310e-02, -5.9285e-03, -2.5732e-03]],
[[-1.5444e-02, -6.3216e-03, -4.8372e-03],
[-2.3063e-02, 1.6431e-02, 8.2081e-04],
[-1.0718e-02, 1.6481e-02, -3.1030e-03]],
[[-1.9730e-02, -2.1219e-03, 6.3024e-03],
[-3.2820e-03, 6.9398e-04, -1.4061e-02],
[-2.3641e-02, -1.3739e-02, -2.1028e-02]],
...,
[[ 1.3894e-02, 2.1300e-02, 1.2783e-02],
[ 2.9861e-02, 1.1308e-02, 1.1565e-03],
[ 1.4007e-02, -1.0189e-02, 9.2872e-03]],
[[ 3.4647e-02, 2.0312e-03, -2.2116e-02],
[ 3.4164e-02, 4.0395e-03, -3.6434e-02],
[ 2.4236e-02, -5.8397e-03, -2.0906e-02]],
[[ 3.2749e-03, -1.0375e-03, 8.8837e-03],
[ 2.3392e-02, 4.5150e-03, -4.6558e-03],
[-3.3005e-03, -1.7105e-03, -1.6713e-02]]],
[[[-1.4679e-02, -5.4663e-03, -1.5582e-02],
[ 6.7112e-03, 9.1110e-03, -1.7623e-02],
[ 6.2036e-03, 4.0262e-03, -1.3794e-03]],
[[ 3.9616e-03, 3.8155e-03, 2.1631e-02],
[-4.8252e-03, -9.6756e-03, -1.8287e-02],
[ 5.6259e-03, 8.4448e-03, -1.0573e-02]],
[[ 1.3435e-02, 1.7807e-02, 6.8159e-03],
[ 1.1273e-02, 9.0411e-03, 6.0426e-03],
[ 1.1013e-02, 1.3509e-02, 1.7305e-02]],
...,
[[ 1.8668e-02, 3.5594e-02, -8.3857e-03],
[ 3.0522e-03, 2.0724e-03, -1.6624e-02],
[-9.3803e-03, -3.0223e-02, -2.4824e-02]],
[[-3.5788e-03, 6.5197e-03, 1.4984e-02],
[ 6.3232e-03, -7.1723e-03, 2.9092e-03],
[-5.9776e-03, 1.6511e-02, 6.5239e-03]],
[[-2.6203e-02, -1.0531e-02, 8.6346e-03],
[-1.2161e-02, -1.7158e-02, -6.8837e-03],
[-1.5274e-02, -1.9920e-02, -2.2666e-02]]],
...,
[[[ 2.8817e-02, 5.7064e-02, 2.6786e-02],
[ 2.7877e-02, 4.1337e-02, 3.5111e-02],
[ 2.0855e-02, 8.2302e-03, 2.4153e-02]],
[[ 4.9293e-03, 1.7128e-02, 8.3307e-03],
[-1.3552e-02, -4.0895e-03, 9.5780e-03],
[-3.0197e-03, -1.6909e-02, -6.2344e-04]],
[[ 4.9377e-03, 1.5089e-02, 4.2446e-03],
[-2.1481e-02, -1.0198e-02, -3.2459e-03],
[-9.8801e-03, -2.9522e-02, -2.3120e-02]],
...,
[[ 4.7656e-03, -1.0690e-02, -9.2157e-03],
[ 7.5166e-03, -5.4019e-03, -7.2855e-04],
[-5.2991e-03, -1.5755e-02, -5.9778e-03]],
[[-4.3410e-02, -2.8615e-02, -8.0603e-03],
[-2.1413e-02, 1.2001e-03, 1.2148e-02],
[-1.1315e-03, 5.8441e-04, 3.5502e-02]],
[[-1.6830e-02, -6.9466e-03, 2.2042e-03],
[ 1.3756e-02, 2.5687e-03, -4.5938e-03],
[-1.6030e-02, -2.0392e-02, -8.5904e-03]]],
[[[ 4.5148e-03, 9.7426e-03, 1.1179e-02],
[ 1.6208e-02, 2.1875e-02, 3.0243e-03],
[ 1.4006e-02, 1.2082e-02, -5.5489e-03]],
[[-1.1140e-02, -2.4244e-02, -6.9401e-03],
[ 8.6286e-03, -1.9552e-02, -1.7388e-02],
[ 2.2973e-03, 6.4003e-03, -1.4678e-02]],
[[-3.4594e-03, -2.6753e-02, -2.7685e-02],
[ 7.5820e-04, 3.0425e-03, -6.5056e-04],
[ 1.7231e-03, 8.6383e-03, 8.0574e-03]],
...,
[[ 1.0818e-02, -4.7188e-03, 6.5114e-03],
[ 5.3004e-02, -4.5559e-04, 1.9125e-02],
[ 2.8295e-02, 7.0543e-03, 3.6029e-02]],
[[ 4.4104e-03, -5.3597e-03, -1.6198e-02],
[ 6.7288e-04, -3.5397e-03, -1.0794e-02],
[ 2.6040e-02, 4.2144e-03, -9.4279e-03]],
[[ 8.7836e-04, 9.5618e-03, 3.1830e-02],
[ 4.0046e-03, 1.8659e-03, 5.1198e-03],
[ 1.5731e-02, 4.5981e-03, -1.1626e-02]]],
[[[ 2.4723e-02, 9.8198e-03, 7.3155e-03],
[ 2.1681e-03, -4.8742e-03, 1.5810e-02],
[ 2.5210e-03, -1.7476e-04, 5.4844e-03]],
[[-9.6545e-07, -3.3875e-03, 6.1317e-03],
[-5.6578e-03, -9.1398e-03, -7.5964e-03],
[-5.2246e-03, -2.5943e-03, -4.8435e-03]],
[[-2.1442e-02, -2.6685e-02, -1.6241e-02],
[-1.9050e-02, -3.2640e-02, -2.7540e-02],
[-3.7944e-02, -3.8547e-02, -3.3783e-02]],
...,
[[ 1.5289e-02, 1.0063e-02, 6.3048e-03],
[-8.8774e-03, 8.5941e-04, -4.4288e-04],
[-9.4150e-03, -3.1109e-03, 4.5257e-03]],
[[ 2.1593e-02, 3.5399e-03, -1.7327e-02],
[ 2.1892e-02, -5.4157e-03, -3.1937e-02],
[ 1.4619e-02, 4.0883e-03, -2.5810e-02]],
[[-4.6254e-03, -1.0805e-02, 4.8783e-03],
[-1.5468e-02, -1.4464e-02, -2.8306e-03],
[-3.0682e-02, -1.8622e-02, -2.5366e-03]]]])), ('features.19.bias', tensor([ 4.8539e-02, 1.8543e-01, 1.0437e-02, -5.2374e-02, 2.8441e-02,
5.9573e-02, -3.1299e-02, 4.2916e-02, -8.0564e-03, -1.1774e-02,
-1.4840e-01, -3.9615e-02, -1.4545e-01, -4.1749e-02, -8.9216e-02,
3.1039e-01, 8.0442e-02, -1.7013e-01, 7.1368e-02, 2.1807e-01,
8.4100e-02, -1.1073e-01, -3.4990e-02, -9.2167e-04, 1.4945e-01,
-1.9167e-03, 4.8934e-03, 1.9026e-02, 5.3401e-02, 2.0975e-01,
5.3031e-02, 5.9212e-02, -4.7142e-02, 1.4421e-02, 3.0032e-02,
6.9043e-02, -7.8601e-02, -2.9682e-02, 8.6846e-02, 1.9919e-01,
-7.7397e-02, 1.3564e-01, 4.2665e-02, 3.0389e-02, 8.1631e-02,
1.9028e-02, -7.3459e-02, 5.8934e-03, -1.9776e-02, 5.9806e-02,
3.7804e-02, -1.0258e-01, 3.6124e-02, -4.1583e-02, 3.4364e-02,
-9.0316e-03, 2.8344e-02, 7.5596e-03, -3.2274e-02, 5.5055e-03,
4.6274e-03, 5.2689e-02, -2.4146e-02, -6.5551e-02, 9.0461e-02,
-2.7640e-02, 5.2692e-02, -4.9342e-02, -2.5124e-02, -1.2564e-02,
-2.6508e-02, 5.2646e-02, -7.9319e-02, 3.1563e-02, -7.7553e-03,
-2.8984e-02, -9.4487e-02, -6.6763e-02, 6.0426e-02, -5.3435e-02,
4.1247e-02, -1.8195e-01, -5.0668e-02, 1.0614e-02, -1.3076e-02,
1.5918e-01, -4.8236e-02, 3.3868e-02, 1.8622e-01, -1.0235e-01,
1.0884e-02, 5.1425e-02, -9.2569e-02, -7.7302e-03, 2.0935e-02,
2.7898e-01, 1.2535e-01, -1.2700e-01, -8.3232e-02, -5.5353e-02,
-4.4103e-02, -4.9873e-02, 1.1130e-01, -9.5322e-02, -1.4624e-02,
-4.5828e-02, -4.9705e-02, -5.7023e-02, -3.1006e-02, 5.2477e-02,
1.0429e-02, 1.3837e-01, -3.0914e-03, 5.8667e-03, -1.1464e-01,
2.7715e-01, -3.9966e-02, -2.0109e-02, -1.1062e-01, -1.3977e-01,
-7.3065e-02, -1.4450e-02, -6.5972e-02, 4.5349e-03, -4.6339e-02,
5.9486e-02, -4.8973e-02, -7.2482e-02, 5.9589e-05, 2.7814e-02,
9.6529e-02, 9.1545e-02, -7.7284e-02, 1.6943e-01, -7.2803e-03,
-7.5243e-02, -1.5994e-01, -2.7652e-02, 1.2272e-03, -1.1472e-01,
-7.4792e-02, 8.9549e-02, -7.7055e-02, -2.2745e-02, -7.5935e-02,
2.7557e-01, 5.2467e-02, 5.5881e-03, -8.5725e-03, 1.0043e-01,
-7.2981e-03, 1.9595e-02, -3.6698e-02, -1.3603e-01, -9.0080e-02,
1.0666e-02, -1.8209e-02, -3.6660e-02, -7.8372e-02, -3.0423e-02,
1.3261e-02, -2.5981e-01, 4.2773e-02, -6.8004e-02, -3.9249e-03,
6.3254e-03, 3.1358e-02, 8.2736e-02, -3.7702e-02, 5.4848e-02,
-4.9339e-02, 7.3403e-03, -8.4238e-02, -1.0084e-01, 6.2125e-02,
-7.7764e-02, 2.2460e-01, 4.1738e-02, 2.0611e-02, -1.7923e-02,
8.4160e-02, 4.2331e-02, -4.1427e-02, -9.3484e-02, -4.6644e-02,
7.0029e-03, 1.5403e-01, -5.1248e-02, 2.1564e-02, 2.2336e-02,
6.1144e-02, -4.1649e-02, -1.4518e-01, 4.3635e-02, -4.6644e-02,
1.2000e-01, -9.9771e-03, -4.8811e-02, -8.3224e-03, 3.0715e-03,
1.2432e-01, 7.1385e-02, -2.2899e-02, -1.5567e-01, -3.4630e-02,
2.2629e-02, 8.3408e-02, 1.1082e-02, -9.2070e-02, -5.3920e-02,
-4.1987e-02, 2.3545e-02, 3.6946e-02, 1.5139e-01, -2.1353e-02,
-6.7549e-02, -2.3937e-02, 2.8144e-03, -3.4208e-02, -1.1696e-01,
6.7998e-02, 2.9330e-01, -5.8783e-02, 5.4564e-02, -4.5076e-02,
-8.4100e-03, -1.8029e-02, -5.0699e-02, -1.2053e-01, -1.0290e-01,
6.0891e-02, -4.2419e-02, -6.7452e-02, -6.5255e-02, -1.8794e-03,
7.6948e-02, -9.4908e-02, -4.1975e-02, -3.0443e-02, -3.6526e-03,
6.3435e-02, -4.7602e-02, -5.2816e-02, -8.9736e-03, 1.5673e-02,
-3.4718e-02, 6.2641e-02, -1.1035e-02, 4.8176e-02, -1.2357e-03,
-2.8346e-02, -1.3969e-02, 6.9463e-02, 7.3244e-02, -3.0851e-02,
-8.0603e-02, -8.3256e-02, -3.5268e-02, 1.0762e-01, 1.4447e-02,
-6.2395e-02, 6.7069e-02, -6.0583e-02, 1.1230e-01, -1.2477e-02,
-1.3437e-02, -8.9206e-03, 4.2926e-02, -5.1212e-03, 5.6383e-02,
4.3455e-03, 7.4903e-02, 5.4617e-02, -1.0920e-01, -9.3079e-02,
2.7993e-02, -1.0010e-01, 6.0745e-02, -9.1382e-02, -4.8985e-02,
2.6096e-02, 3.4188e-02, 4.1783e-02, -5.6487e-02, -1.2240e-04,
-7.8672e-02, 3.9104e-02, 1.2764e-01, -8.7900e-03, -5.1296e-02,
1.2932e-01, 7.5961e-02, 1.1269e-02, -2.7532e-02, -1.3130e-02,
1.1197e-02, 1.1543e-01, -9.5668e-03, -6.1839e-02, -4.2182e-02,
-1.6125e-02, 5.9946e-03, 9.4427e-02, 1.9066e-02, 1.1821e-01,
-1.4119e-02, 3.5984e-02, -9.9561e-02, 2.8469e-02, 1.0130e-01,
6.0111e-02, -1.1442e-01, 1.3231e-02, -5.1268e-02, 2.3183e-01,
1.2102e-01, -5.9728e-02, -5.8650e-02, -4.6801e-02, 1.9844e-01,
-6.0913e-02, -5.3739e-02, -2.6642e-02, 1.9160e-01, 3.2030e-02,
4.9399e-02, 6.2993e-02, 3.3099e-02, 2.5505e-01, 6.2240e-02,
2.3869e-02, -5.4111e-02, 1.3214e-01, 9.7300e-02, -9.2146e-02,
6.3166e-02, -6.4136e-02, -1.6049e-02, -8.7184e-03, -1.0280e-02,
-1.0107e-01, -1.0591e-02, 3.5118e-02, 2.4327e-03, -1.1066e-01,
-1.2848e-01, 1.1588e-01, -4.7567e-02, -3.8811e-02, -6.7900e-02,
1.2797e-01, 3.1762e-02, -2.1549e-02, 5.4774e-02, -5.3559e-02,
3.0571e-02, 7.3556e-02, 1.5414e-02, 6.2837e-02, -9.1454e-02,
-1.7830e-01, -1.4753e-02, -1.6593e-02, -8.2559e-02, 1.0468e-01,
8.2539e-02, 2.5728e-01, -5.7627e-02, -8.9256e-02, 6.8375e-02,
-2.6886e-02, -1.2403e-01, 2.6818e-01, 1.0334e-01, -5.4301e-04,
-4.3253e-02, 4.7986e-02, 3.9626e-01, 1.5786e-01, 3.3637e-03,
-1.1727e-01, 1.0403e-01, 4.1580e-02, 1.2370e-01, -3.5918e-02,
-5.9192e-02, 1.8550e-01, 7.7097e-02, -5.5645e-03, -3.7177e-02,
-5.6007e-02, 3.8158e-02, 1.9851e-02, -2.5048e-02, -8.8209e-02,
-6.6868e-03, -3.2120e-03, 1.8791e-01, -8.7897e-03, 9.2033e-02,
-1.0737e-02, 6.5269e-02, 9.0359e-03, -9.4758e-03, 2.1492e-02,
-4.4483e-02, -1.1424e-02, 2.5385e-02, 5.0426e-02, -9.4697e-02,
-3.2992e-03, 2.9230e-02, -1.0388e-01, 1.0746e-01, 4.2220e-02,
-1.0703e-01, -9.6860e-02, -4.5661e-02, -1.1000e-01, -1.5803e-02,
4.2221e-02, -9.1810e-02, -2.1710e-02, -5.8438e-02, -6.6940e-02,
3.1303e-02, -1.1083e-02, -9.9117e-02, -4.5597e-02, -1.8961e-02,
3.2217e-02, 2.5971e-02, -5.6648e-02, -6.3156e-02, 2.2318e-03,
-7.2861e-02, -4.0914e-02, 3.9712e-02, 1.2875e-03, -5.0004e-02,
3.5050e-02, -1.4574e-01, 7.5313e-02, 1.3336e-01, -3.1844e-02,
-6.0656e-02, -3.8036e-02, 3.1332e-02, -7.1490e-03, 1.9854e-01,
-1.3675e-02, -5.1290e-03, -6.6648e-03, -7.8550e-02, 3.9882e-02,
9.4939e-02, -4.9912e-02, -7.0492e-02, 6.0279e-02, 2.1338e-01,
-1.4929e-02, 2.4419e-01, -6.5588e-02, -8.1514e-02, -5.3073e-02,
-5.3437e-02, -1.1278e-02, 2.4811e-01, -9.8490e-03, -5.5535e-02,
5.1402e-02, -4.2619e-02, 1.9044e-03, -4.4433e-02, 2.2332e-02,
-2.9031e-02, -2.9819e-02, 1.6947e-01, -9.8397e-02, -2.8999e-02,
1.5747e-02, 3.0632e-02, -4.0140e-02, 8.9229e-02, 1.8694e-01,
-5.5201e-02, -4.1285e-02, -3.0612e-01, 3.8582e-02, 6.4804e-02,
4.9554e-03, 1.1429e-01, 1.9202e-01, 4.9060e-02, -6.1881e-02,
-4.2886e-02, 1.7173e-02, 2.6790e-02, 5.2347e-02, -7.5116e-02,
-2.9761e-02, 1.5196e-01, -6.2845e-02, 2.1838e-02, -1.9060e-02,
-1.0153e-01, 7.1881e-03, -5.7824e-02, -5.7822e-02, -3.4541e-02,
-5.8572e-02, 1.6760e-01])), ('features.21.weight', tensor([[[[-6.3067e-02, -2.6282e-02, 2.0478e-03],
[-6.5387e-02, -2.5262e-02, 1.5350e-03],
[-3.2395e-02, -2.4593e-02, -1.6119e-02]],
[[ 1.4005e-03, 1.5848e-02, 2.2248e-02],
[-6.5395e-03, 8.8427e-03, -6.1951e-03],
[-8.3691e-03, -4.9664e-03, -2.3819e-02]],
[[-1.0673e-02, -7.9837e-03, -5.1039e-03],
[ 3.1638e-04, 1.2396e-02, 6.1134e-03],
[ 1.0141e-02, 2.8345e-02, 1.8650e-02]],
...,
[[ 1.3700e-02, 1.5561e-02, 1.3848e-02],
[ 1.7697e-02, 8.2817e-03, 1.1439e-02],
[ 2.9955e-04, -8.5686e-03, 4.2489e-03]],
[[ 3.4649e-03, 2.3276e-03, -3.2113e-03],
[-8.5441e-03, -6.6467e-04, 1.4675e-02],
[ 7.4824e-03, -4.6237e-03, 9.1847e-03]],
[[-5.1088e-03, -5.4534e-03, -9.6147e-04],
[-1.1531e-02, -1.8771e-02, -6.8382e-03],
[-1.3029e-02, -4.4073e-03, 1.7698e-03]]],
[[[-2.1158e-02, -8.0839e-03, -6.4797e-03],
[-8.6919e-03, -1.0732e-02, -8.7086e-03],
[-2.4214e-04, -1.2195e-02, -1.2103e-02]],
[[ 1.7100e-02, 2.6715e-02, 2.3094e-02],
[ 2.6800e-02, 1.6322e-02, 2.2969e-02],
[ 6.1205e-03, -3.0861e-03, 4.8568e-03]],
[[-5.9623e-03, -1.5155e-02, -1.2942e-02],
[-1.3615e-02, -2.1252e-02, -1.2252e-02],
[-5.6196e-03, -8.4693e-03, -3.3066e-03]],
...,
[[ 5.9291e-03, -3.9783e-03, -5.7702e-03],
[ 6.7534e-03, -1.1587e-02, -6.9357e-03],
[ 2.8941e-02, 2.3353e-02, 1.8950e-02]],
[[ 1.2965e-02, 1.3479e-02, 1.0113e-02],
[ 6.1548e-03, 1.5268e-03, -1.8471e-02],
[ 4.9778e-03, -1.2377e-02, -2.3814e-03]],
[[ 1.0081e-02, 1.5100e-02, 2.1001e-03],
[-1.5220e-03, 1.5255e-03, 1.8809e-03],
[-7.3236e-03, -3.4785e-03, -1.8081e-05]]],
[[[ 5.3886e-03, 1.4384e-02, 2.4193e-02],
[ 3.5856e-02, 1.3598e-02, 9.1420e-03],
[ 2.7422e-02, -5.7670e-03, -2.3722e-02]],
[[ 1.4028e-04, 2.5377e-03, -1.7109e-02],
[-1.0425e-03, 7.6473e-03, -9.6282e-03],
[-9.7224e-03, -3.0599e-04, 8.4231e-05]],
[[ 1.1451e-02, 1.9407e-02, 1.1275e-02],
[-2.3620e-02, -2.2597e-02, -1.6060e-02],
[-2.1398e-02, -1.6749e-02, -1.7806e-02]],
...,
[[-4.0523e-03, 7.9542e-03, 1.3295e-02],
[-1.0381e-02, -8.1025e-03, 9.6132e-04],
[-2.3355e-03, -1.3688e-02, -1.3965e-02]],
[[-1.5433e-02, 8.8126e-03, 1.3177e-02],
[-3.4207e-03, 1.9939e-02, 2.7096e-04],
[ 4.9602e-03, 2.1484e-02, 6.5832e-03]],
[[-5.9714e-03, -5.7486e-03, 1.2835e-02],
[-7.0907e-03, -2.6756e-03, 9.4222e-03],
[ 2.3858e-03, 3.3486e-04, 1.3487e-02]]],
...,
[[[ 1.1402e-02, -8.7235e-03, -9.6955e-03],
[ 5.0083e-03, -1.8251e-02, 1.5054e-02],
[ 6.2961e-03, 1.8800e-03, 1.7217e-02]],
[[-1.8076e-03, 8.0578e-03, 8.7655e-03],
[-7.5966e-04, 1.1668e-02, 7.2065e-03],
[-4.0283e-03, 1.4189e-02, 1.4049e-02]],
[[ 6.5631e-06, -9.8930e-03, -4.9429e-03],
[ 6.0358e-04, -1.3708e-02, -1.1244e-02],
[-2.1108e-03, -1.5522e-02, -1.0663e-02]],
...,
[[-3.1722e-03, -3.1996e-03, 9.7550e-03],
[-7.9536e-03, 7.7292e-03, -7.1677e-03],
[-2.8470e-03, 5.7877e-04, 5.9350e-03]],
[[-5.5721e-03, -1.8342e-03, 1.3169e-02],
[-1.8986e-02, -6.2273e-03, 7.2322e-03],
[-5.4841e-03, -1.5104e-02, 5.5088e-03]],
[[-5.7446e-03, -1.7867e-05, 1.4082e-02],
[-2.0141e-02, 9.2782e-03, 2.3557e-02],
[-1.2853e-02, 1.8403e-03, 3.3924e-02]]],
[[[ 1.0337e-02, -1.1769e-02, -1.9753e-02],
[ 1.1631e-02, -1.8573e-03, -1.1960e-02],
[ 3.2027e-02, 2.7565e-02, 2.0444e-02]],
[[-1.1984e-02, 9.4871e-03, 5.7944e-03],
[-1.8031e-02, -1.3659e-02, -1.3353e-03],
[-5.7118e-03, -1.3395e-02, -1.2731e-02]],
[[-6.2501e-03, -5.1844e-03, -5.8465e-03],
[-1.1531e-02, -1.1297e-03, -6.9543e-03],
[-1.4874e-02, -1.1113e-02, -8.9067e-03]],
...,
[[-3.1893e-03, -3.4867e-03, -7.8033e-03],
[ 3.2946e-03, -1.6799e-02, -2.6469e-02],
[ 7.4793e-03, 2.5874e-03, -1.1537e-02]],
[[ 1.2194e-02, 1.2353e-02, 1.4823e-04],
[ 6.3877e-03, 2.2248e-02, 2.7899e-02],
[ 1.0109e-02, 8.6574e-03, 1.6423e-02]],
[[ 2.2551e-02, 8.8039e-03, 5.7754e-03],
[ 2.2307e-02, 2.1129e-02, 1.0335e-02],
[ 1.6131e-02, 5.8890e-03, 1.1387e-02]]],
[[[-1.9857e-02, -5.6656e-03, -1.1866e-02],
[-1.6634e-02, -1.0498e-02, -1.5674e-02],
[-2.3948e-02, -2.2151e-02, 8.2650e-04]],
[[ 2.5921e-02, 1.1912e-02, 1.9136e-02],
[ 1.3309e-02, 1.2539e-03, 1.3614e-02],
[ 1.8386e-03, -5.7252e-03, 4.9213e-03]],
[[ 7.9654e-04, 1.7148e-03, -1.4521e-03],
[-4.2225e-03, 1.0216e-03, -4.9709e-03],
[ 6.9864e-03, -7.1502e-04, 2.0902e-03]],
...,
[[-1.8430e-03, 1.6867e-02, 4.2366e-03],
[-6.1424e-03, -2.3727e-02, -2.0847e-02],
[-2.6380e-03, -8.3771e-03, -1.7963e-02]],
[[ 1.0369e-02, 2.0357e-03, -5.3637e-03],
[-2.2568e-03, 1.4899e-03, 3.9693e-03],
[ 1.1021e-02, 5.1284e-03, 2.3431e-03]],
[[ 1.3434e-02, 2.0415e-02, 1.9769e-02],
[-1.2644e-02, -1.9282e-03, 5.9442e-03],
[ 8.3090e-04, 2.0502e-03, -2.6332e-04]]]])), ('features.21.bias', tensor([ 6.3634e-02, -1.0967e-02, -3.2003e-02, 2.1559e-01, 1.5291e-02,
2.0261e-02, 1.3345e-01, 2.6394e-03, 1.0827e-02, -8.4166e-02,
1.1893e-01, -1.0924e-01, 4.0563e-02, -5.8301e-04, 1.0442e-01,
8.1165e-02, 7.9415e-03, 1.0516e-01, -2.8611e-02, 1.3504e-01,
5.9757e-02, 9.8144e-03, -7.2917e-02, 3.1385e-02, 1.7999e-01,
2.6940e-02, 1.6145e-01, 8.0286e-02, -1.9479e-02, 3.8739e-02,
-7.1590e-03, 7.8570e-03, 8.7725e-02, 4.8550e-03, 2.2504e-02,
5.3431e-02, 3.5247e-02, 8.5775e-02, 1.3301e-03, -1.9315e-03,
1.2969e-01, 8.6063e-03, 2.1605e-01, 1.3279e-02, 1.8656e-01,
7.5895e-03, -1.8299e-03, -5.7994e-02, 6.6223e-02, 7.6909e-02,
2.1927e-02, 8.8756e-02, -4.8525e-02, 1.2341e-02, 4.9017e-02,
-1.0056e-02, 4.7132e-02, 7.0428e-02, -2.1244e-02, 4.1273e-02,
3.5426e-03, -2.3756e-01, -7.7789e-02, -8.4740e-02, 8.0331e-02,
1.0795e-01, 6.5318e-03, -8.7282e-02, -1.5889e-01, 1.9795e-01,
-7.0154e-02, 1.3406e-01, 4.3418e-02, -1.2304e-01, 6.7416e-02,
6.9133e-02, 9.2467e-03, 3.0974e-02, 5.5059e-02, -2.0361e-02,
2.4571e-01, 2.7463e-01, 5.7688e-02, 5.3989e-02, -1.6555e-01,
-9.4396e-02, 3.7043e-02, 7.9657e-02, -2.7590e-04, 2.9020e-02,
-4.3371e-02, -1.0061e-02, -5.2683e-01, 4.2785e-02, -4.4344e-02,
-2.1241e-02, -2.8258e-02, 6.8707e-02, 4.3348e-02, 7.6244e-02,
1.0081e-01, -2.1258e-02, -8.4838e-02, 5.4727e-02, 1.6625e-01,
-5.9622e-02, 1.2124e-01, -7.0624e-02, -5.5650e-02, -8.3777e-02,
1.5211e-01, -1.6716e-01, 2.9240e-01, -2.6863e-02, 1.6461e-01,
6.1247e-02, -1.1812e-01, 7.6181e-02, 1.3911e-01, -1.5507e-02,
-5.7944e-02, -3.3484e-02, 3.7752e-02, -2.3442e-02, 1.8417e-01,
-1.1959e-01, 4.2010e-01, -2.0901e-01, 4.5619e-02, 1.8324e-02,
7.5185e-02, -2.1668e-01, 5.9447e-02, -2.8442e-01, -1.0359e-01,
-6.6070e-02, -3.3814e-02, 3.5973e-02, 1.8360e-02, 9.5702e-02,
-1.4127e-02, 8.8156e-04, 7.7657e-02, -1.3281e-02, 6.5500e-02,
2.2525e-02, 9.9867e-02, 3.3745e-02, -2.6081e-02, -1.8246e-02,
1.3688e-01, 8.5862e-02, -6.2230e-02, -9.7107e-02, 1.4473e-02,
2.1385e-02, 2.3951e-01, -8.1395e-02, -6.6086e-02, 7.5855e-02,
1.1688e-01, 7.0092e-02, 9.2100e-02, -5.8266e-02, -2.5284e-02,
-2.9728e-02, -3.6264e-03, 1.0980e-01, 9.8353e-02, 1.1732e-01,
6.2003e-04, 8.1404e-03, -5.5455e-02, 1.4891e-02, 4.9745e-02,
8.0499e-02, 1.7418e-03, -4.9516e-02, -1.3145e-02, 1.2021e-02,
1.7709e-02, 6.4688e-02, 1.6491e-02, -7.0984e-02, 5.8787e-02,
-3.4248e-02, 1.3473e-01, 5.3230e-02, 5.8115e-02, -2.1307e-01,
1.2444e-01, 6.8019e-02, 7.2520e-02, -3.0275e-03, 3.6072e-03,
-2.7347e-02, -6.7599e-02, -4.5567e-02, -2.4351e-02, -1.7376e-02,
9.7094e-02, 1.3540e-01, 1.3309e-01, 9.0295e-02, 3.1864e-02,
1.3095e-01, -1.7897e-01, -1.2532e-01, -5.5461e-02, 3.4644e-03,
1.1916e-01, 1.0131e-01, 7.2864e-02, 1.5379e-01, 1.1285e-02,
2.6610e-01, 8.1305e-02, -3.5514e-03, 1.4103e-01, 9.6407e-02,
9.9711e-02, 1.0446e-01, -3.9848e-03, 1.0569e-01, -6.4742e-02,
-7.2812e-02, -6.8756e-04, -1.2480e-01, 5.1828e-03, -1.6377e-02,
2.6813e-03, 2.2378e-01, -2.8535e-02, 6.9779e-02, 6.4321e-02,
-4.5533e-01, 1.2676e-01, -2.0388e-02, 4.4023e-02, 1.3902e-02,
2.5257e-02, -5.1273e-02, 8.4782e-02, -1.9547e-01, -2.8506e-02,
1.0772e-01, 7.5899e-02, 9.4364e-03, 5.8231e-02, -5.1742e-02,
3.5049e-02, 1.7808e-01, 3.2129e-02, 5.9625e-03, -1.5778e-02,
1.6101e-01, 4.3531e-02, 1.2852e-01, 8.2206e-02, 1.1279e-02,
-1.2372e-01, 1.7948e-02, -2.0397e-02, 3.3083e-02, -3.2450e-01,
4.3616e-02, -2.3773e-02, -1.8045e-02, -3.8831e-02, 7.4837e-02,
4.0458e-02, 1.1571e-01, 6.2879e-02, -7.8334e-03, -1.4513e-01,
6.8077e-02, -2.3668e-02, -9.7854e-03, 7.3647e-02, -7.1214e-03,
9.8975e-03, -3.4187e-02, -5.9872e-02, 7.0949e-02, 2.0886e-02,
-1.4696e-01, 3.7426e-02, 1.3269e-02, 1.7200e-01, -3.4546e-02,
2.5491e-01, -8.0800e-02, 4.7864e-02, 6.6607e-02, 2.2006e-02,
-2.8471e-02, 2.4841e-01, -3.0223e-02, 2.3148e-01, 3.0716e-02,
4.0106e-01, 1.2361e-01, -7.1742e-02, 4.7943e-02, 6.9409e-03,
1.5807e-01, -6.8220e-03, 1.6396e-02, 1.0002e-01, -3.0787e-02,
-3.7432e-02, 4.4643e-02, -9.0015e-02, 8.1886e-02, -5.8072e-03,
-2.3569e-01, 1.5767e-02, -4.4003e-02, -1.2684e-02, 1.3281e-01,
1.8176e-02, 3.5537e-02, 1.6213e-01, 4.8974e-03, 1.1320e-01,
1.4529e-02, 7.0981e-02, -5.2959e-03, 1.2178e-01, 1.4223e-01,
-1.0573e-02, 6.8792e-02, 4.2784e-02, -1.0714e-01, 2.3512e-01,
-3.7534e-02, 7.8099e-02, 1.2942e-01, 2.3506e-02, -1.0872e-01,
1.6058e-02, 2.2538e-02, 1.3274e-02, 1.7579e-01, 1.3075e-01,
3.8376e-02, -6.7803e-02, 1.6718e-01, 1.1343e-01, -8.5305e-02,
5.7568e-02, -2.2814e-02, -3.2776e-02, -1.2268e-01, -3.3054e-02,
4.0078e-02, 1.0197e-02, 2.3004e-02, 1.7812e-02, 2.5393e-02,
-1.2048e-01, 1.1569e-01, -7.9576e-02, 2.8545e-02, -7.3535e-02,
3.9270e-02, 7.9008e-02, 1.5001e-02, 5.4116e-02, 3.2493e-02,
-1.7649e-02, 7.8840e-02, 7.1999e-02, 9.3409e-02, 5.3340e-02,
-4.3711e-02, 1.2580e-01, -5.4338e-02, 9.3274e-02, -2.1260e-01,
-5.1690e-03, -3.7454e-02, 1.2110e-02, -2.2991e-02, 5.6710e-02,
-3.3027e-02, 1.9783e-01, 5.3072e-02, 6.9308e-02, 5.1325e-02,
2.2928e-01, -5.1877e-02, 6.8021e-03, 1.1477e-02, -3.0954e-02,
-7.3070e-02, 5.8915e-02, 1.1739e-01, 1.2369e-01, 1.9607e-02,
-1.4208e-02, -1.6941e-02, 6.9624e-02, -1.0058e-01, 7.1995e-02,
-8.4531e-02, -5.9671e-02, -7.9782e-02, -6.0475e-02, 7.3756e-02,
5.6674e-02, -1.4331e-02, 1.4957e-01, -6.3951e-02, 8.9822e-03,
1.8146e-01, -2.0714e-01, -1.1522e-01, 5.8012e-02, 2.0544e-02,
-1.1090e-02, -1.4533e-02, -5.4449e-02, 2.2929e-02, 1.0210e-01,
1.0104e-02, -9.5641e-02, 1.1436e-02, 6.9774e-02, 2.7891e-02,
5.5458e-02, -2.0825e-01, 8.1965e-02, 6.3960e-03, 1.2738e-01,
-4.0468e-02, -9.1170e-02, 1.2352e-01, 1.1425e-01, -1.0065e-01,
6.5737e-02, 9.7217e-02, -1.5311e-01, 1.1020e-01, 1.0239e-01,
8.4419e-03, 6.2746e-02, -9.7853e-03, -6.6968e-02, 8.0429e-02,
1.3746e-01, 1.8424e-01, 5.8073e-02, 1.0730e-01, 1.9464e-01,
1.0486e-02, -2.1657e-01, -1.3406e-01, -3.8454e-02, 1.4156e-01,
9.7917e-02, -6.1036e-02, 1.7651e-01, -2.3960e-02, 3.7130e-01,
3.0424e-01, 1.1539e-01, 1.2905e-01, 1.0916e-01, 1.1926e-02,
6.1304e-02, 1.1306e-01, 6.2640e-02, 2.1442e-02, 9.5504e-03,
-5.7049e-02, 1.1354e-02, -3.3664e-02, -1.9768e-01, -9.5412e-02,
-1.2056e-02, 1.7186e-01, -4.2196e-02, 1.3421e-02, 2.5391e-01,
-8.1678e-02, 4.1609e-02, 9.9354e-02, 6.5086e-02, 2.1555e-02,
-2.8555e-02, 3.9176e-03, -6.4248e-02, -5.0227e-01, -1.3557e-01,
4.4501e-02, 4.3929e-02, -1.8375e-01, 4.1839e-03, -4.0164e-02,
-1.5749e-01, 1.7741e-01, -1.2139e-02, 1.7253e-01, -1.2950e-01,
4.2050e-02, 5.3931e-02, 1.2980e-01, -1.4973e-02, 1.6851e-01,
8.5195e-02, 4.1993e-02])), ('features.23.weight', tensor([[[[ 2.9275e-03, -1.2253e-02, -2.5162e-02],
[ 1.3126e-02, 1.6620e-02, -6.9234e-03],
[ 3.0615e-02, 3.2961e-02, 2.5451e-02]],
[[-8.2666e-03, -1.4477e-02, -4.9849e-03],
[-1.1521e-02, -1.3034e-03, 7.6385e-03],
[-9.7245e-03, -3.4838e-03, -8.0309e-03]],
[[-1.8526e-02, -2.8393e-02, -2.5642e-02],
[-2.7893e-03, -1.4746e-02, -2.3787e-02],
[-1.0761e-02, -8.3552e-03, -1.4735e-03]],
...,
[[-2.0491e-02, 5.2775e-03, 2.6757e-02],
[-9.8493e-03, 3.5135e-02, 4.3104e-02],
[ 7.1181e-03, 4.0697e-02, 4.2544e-02]],
[[ 2.2077e-03, 9.6445e-03, 7.9257e-03],
[-6.6078e-03, 4.4246e-03, -1.5106e-05],
[-2.3204e-02, -1.3040e-02, -2.5299e-02]],
[[-8.2611e-03, 1.4543e-03, 6.2370e-03],
[-1.9666e-02, -1.2721e-02, 1.9075e-03],
[-9.0014e-03, -7.5546e-04, 1.5022e-02]]],
[[[-1.9414e-03, 1.1059e-02, 2.4792e-02],
[ 1.4155e-02, -9.1138e-05, 9.2181e-03],
[ 1.9704e-02, 1.7571e-02, 1.4511e-02]],
[[-5.0768e-03, -6.6096e-03, -2.0802e-04],
[-1.8916e-02, -1.0596e-02, -8.8739e-03],
[-1.8497e-02, -7.9947e-03, -1.4719e-02]],
[[-7.5639e-03, -1.7445e-02, -1.3920e-02],
[-2.4784e-03, -1.3421e-02, -1.8796e-03],
[-7.4250e-03, -1.8926e-02, -8.3590e-03]],
...,
[[-1.3505e-02, -5.3580e-03, -6.5088e-03],
[-4.0295e-03, 3.6336e-04, -1.5681e-03],
[-6.4124e-03, 4.5903e-03, -3.9457e-03]],
[[ 2.1979e-03, -9.2777e-03, -1.7914e-02],
[ 1.1059e-02, 4.0773e-03, 4.0868e-06],
[-1.3653e-03, 1.6354e-02, 2.0493e-02]],
[[ 7.8404e-04, -7.1808e-03, 1.0719e-02],
[ 1.5481e-02, 9.9563e-03, 2.7061e-03],
[ 4.0220e-03, 5.8512e-03, -2.4787e-03]]],
[[[ 1.6229e-02, 1.5673e-03, -2.8752e-03],
[ 4.6689e-03, -1.8698e-03, -5.2654e-03],
[-1.2270e-02, -2.1086e-02, -8.2690e-03]],
[[-2.3033e-02, -1.7427e-02, -1.4707e-02],
[-1.2883e-02, -3.8556e-03, -6.7070e-03],
[ 6.6473e-04, -4.8108e-03, -2.1723e-03]],
[[ 1.9949e-02, 2.4207e-02, 2.3978e-02],
[-3.4677e-03, -1.7741e-02, 3.7338e-03],
[ 6.3721e-03, -3.7063e-03, -8.4862e-03]],
...,
[[-2.6620e-03, -8.8223e-03, -1.9227e-03],
[-6.9083e-03, -4.4090e-03, -1.5016e-05],
[ 1.6014e-02, 2.2941e-02, 7.6902e-03]],
[[-1.4060e-02, -3.5904e-03, 8.0155e-03],
[-2.5706e-02, -1.9520e-02, -6.0274e-03],
[-1.0995e-02, -3.1026e-03, -8.5401e-03]],
[[ 2.4463e-02, 1.3628e-02, 1.2356e-02],
[ 8.7643e-03, -6.0888e-03, -1.5929e-02],
[-3.2960e-03, -4.7157e-03, -1.2753e-02]]],
...,
[[[ 1.0178e-02, -1.1691e-03, -1.4742e-02],
[ 3.9346e-03, 9.5375e-03, -9.7074e-03],
[-4.8433e-03, -8.6996e-03, 5.4476e-04]],
[[ 2.7464e-03, 9.2587e-03, 2.4756e-02],
[-1.0271e-02, -1.5099e-02, -1.5854e-02],
[-2.2313e-02, -2.7761e-02, -2.5646e-02]],
[[-1.2841e-03, 7.7848e-03, -5.7421e-04],
[ 3.9848e-03, -7.6260e-03, -6.5187e-03],
[ 1.9648e-02, 2.0401e-03, -1.7519e-02]],
...,
[[ 2.6590e-02, 2.7680e-02, 5.2753e-03],
[ 8.2830e-03, 1.3173e-02, -5.6917e-03],
[ 1.5508e-02, 1.5699e-02, 1.0786e-03]],
[[ 1.0243e-02, 1.2398e-02, 1.4319e-02],
[ 9.9900e-03, -7.9246e-04, -1.7649e-02],
[ 2.8109e-02, 1.6701e-02, 1.2946e-02]],
[[-1.3918e-03, -3.7564e-03, -3.3816e-03],
[-2.3800e-02, -1.0440e-02, -1.4325e-02],
[-1.9435e-02, -2.2390e-02, -3.3416e-02]]],
[[[-2.7577e-02, -2.5263e-02, -1.9194e-03],
[-2.1418e-02, -7.1673e-03, 3.2465e-03],
[-1.4101e-03, 2.0002e-02, 1.7160e-02]],
[[-1.3343e-02, -1.5340e-02, -7.8940e-03],
[-7.0566e-03, -2.2347e-03, -5.9629e-05],
[-8.6561e-03, 3.5798e-04, -2.3628e-03]],
[[-6.2963e-03, -1.8472e-02, -1.9626e-02],
[ 2.3775e-03, -4.2304e-03, -2.1733e-02],
[-3.1886e-03, 1.9573e-03, -1.0606e-02]],
...,
[[ 1.8556e-02, 1.8521e-02, 1.5854e-02],
[ 1.0856e-02, -6.0481e-03, -4.9853e-03],
[ 1.7463e-03, -8.6839e-03, -1.0171e-02]],
[[ 5.2912e-03, -6.7292e-03, 2.1528e-03],
[-1.1438e-02, -1.5423e-02, -1.2464e-02],
[-1.7660e-02, -1.9785e-03, -5.4980e-04]],
[[-1.6906e-02, -8.2940e-03, -5.1365e-03],
[-5.5470e-03, -8.9329e-03, -8.0640e-04],
[-5.8723e-03, -6.4249e-03, -2.1500e-03]]],
[[[-1.7504e-02, -1.8783e-02, -2.3246e-03],
[-2.4825e-02, -4.8844e-03, 1.2975e-03],
[-2.0398e-03, 1.0956e-03, 4.5109e-03]],
[[-4.6269e-03, -1.0311e-02, 3.5836e-03],
[-2.5351e-02, -1.9044e-02, -1.6829e-02],
[ 2.2080e-03, -1.1279e-03, 1.1406e-03]],
[[ 2.5150e-02, 7.7548e-03, 1.3852e-03],
[ 4.1909e-02, 4.4223e-02, 9.1024e-03],
[-6.7578e-03, -2.8110e-03, -1.0126e-02]],
...,
[[ 1.7198e-03, -2.0365e-02, -1.9527e-02],
[-2.5652e-03, -9.6311e-03, -1.4540e-02],
[-6.3745e-03, -1.6574e-02, -1.2737e-02]],
[[-1.7682e-02, -1.2860e-02, -1.0523e-02],
[-2.1130e-02, -1.7830e-02, -1.8416e-02],
[-1.6107e-02, -2.1336e-02, -1.9041e-02]],
[[-3.4088e-03, -5.9401e-05, -4.8440e-03],
[-1.0921e-02, -8.1715e-03, -5.3294e-03],
[-6.2645e-03, -1.5367e-02, -7.4189e-03]]]])), ('features.23.bias', tensor([ 9.4337e-02, 2.9287e-02, -3.8371e-02, 1.3606e-01, 4.3968e-03,
1.1473e-01, -1.0727e-03, -5.0280e-02, 5.6087e-02, 5.7233e-02,
-6.0927e-02, -2.3587e-02, -6.8871e-02, 1.2800e-02, 1.2950e-01,
1.3269e-01, -4.5842e-02, 4.3089e-02, 6.9104e-02, 3.4356e-02,
1.8842e-02, 1.3120e-01, 8.2590e-02, -1.9884e-01, 2.0686e-01,
1.3923e-02, 8.9916e-02, 4.8529e-02, 1.2118e-01, -4.8104e-02,
-8.2999e-03, -1.8856e-01, 8.4949e-02, 8.2944e-02, 7.4811e-02,
-2.3402e-02, 6.2097e-02, 2.5934e-03, 8.0791e-03, 9.1788e-02,
2.2047e-01, 1.5847e-01, 1.2807e-01, -2.7739e-02, -8.8928e-02,
1.6613e-02, 4.0772e-02, 1.5401e-02, -4.9482e-02, 1.0952e-01,
8.8412e-02, 9.5372e-02, 7.3259e-02, 1.1489e-01, 1.7568e-01,
1.0645e-01, -1.6068e-02, 1.4049e-02, -9.3792e-02, -2.3967e-02,
-7.2631e-02, 5.0305e-02, -7.5118e-02, 3.2870e-01, 1.7454e-03,
2.7082e-02, -1.5043e-01, 1.1249e-01, 7.7925e-02, 1.3057e-01,
-9.1783e-02, 1.2405e-01, 1.0877e-01, -5.6156e-03, -2.3025e-01,
-1.1929e-02, 7.3653e-02, -6.4028e-02, -8.1795e-04, 1.0986e-01,
6.5432e-03, 1.3136e-01, 7.4875e-02, 1.0522e-01, 4.3962e-02,
4.2369e-02, 1.1133e-01, 3.3964e-03, 9.8447e-03, 2.9549e-02,
2.2290e-01, -6.7186e-02, 2.7830e-01, 4.3073e-02, 1.4167e-01,
-2.5318e-02, 2.3649e-02, 7.5709e-02, 6.6288e-02, 8.8698e-02,
-1.5102e-02, 3.1887e-01, 1.1183e-01, 7.2410e-03, 4.6747e-02,
9.5640e-02, -1.7531e-01, 1.9800e-02, -2.2655e-02, 9.5847e-02,
1.5426e-01, 1.1512e-01, -6.0792e-03, -4.4338e-01, 5.5095e-02,
3.9680e-02, -2.2501e-02, -2.7456e-02, 7.3925e-03, 9.9935e-02,
4.6945e-02, 1.2395e-01, 8.5060e-03, -9.3015e-02, 4.2720e-02,
1.6961e-01, 1.5117e-01, -1.3237e-01, -3.8554e-02, 2.7264e-02,
7.8922e-02, 2.7863e-02, 4.8082e-02, -7.0598e-02, -3.1668e-01,
1.6324e-02, 5.3649e-02, -3.4827e-02, 1.4205e-01, 5.3182e-03,
5.9532e-03, -1.8309e-02, 2.3356e-01, 6.1584e-02, 1.0796e-01,
1.4659e-01, 8.2902e-02, 1.2055e-01, 6.3537e-03, -7.7341e-01,
4.5368e-02, 9.4916e-02, 1.2465e-01, -1.3629e-01, -2.0263e-01,
1.3314e-01, 1.8183e-01, 1.8420e-01, 1.1596e-01, 7.8179e-02,
-1.1020e-02, 6.2291e-02, 2.4924e-02, 4.3661e-02, -4.1048e-02,
1.4613e-01, 5.6160e-03, 1.1711e-01, 1.4817e-01, 3.7262e-02,
1.8522e-02, -3.2286e-02, 1.4468e-02, 2.8628e-02, 3.8756e-03,
6.9308e-03, 7.9477e-03, -4.2334e-01, 1.9054e-01, -1.6487e+00,
9.2094e-03, 8.1591e-02, 2.1409e-01, 3.1032e-02, 3.7052e-02,
4.1859e-02, -3.5401e-01, 1.0451e-02, 9.9247e-02, 1.0851e-01,
1.4963e-01, -1.4459e-01, -2.5997e-01, -1.6123e-02, -3.8469e-02,
6.2799e-02, -6.2628e-02, 1.0104e-01, 2.0208e-01, 1.0101e-01,
7.4882e-02, 1.3406e-01, 9.4736e-03, 1.5951e-01, 2.6138e-02,
1.7791e-01, 4.8678e-02, -1.7155e-02, -3.3741e-02, 4.8767e-02,
5.3774e-03, 4.9498e-02, -1.0032e-02, 2.7699e-02, 1.4853e-02,
1.1034e-01, -1.1143e-02, -7.2846e-03, -4.4818e-02, -2.0328e-01,
1.6183e-01, -4.8601e-02, -5.1872e-04, 1.0944e-01, 8.4800e-02,
-3.3689e-02, 5.6162e-02, 2.8864e-02, 6.1779e-02, 1.0249e-01,
1.6590e-01, 6.9486e-02, -1.6802e-01, 2.0560e-01, 8.8492e-02,
3.1583e-02, -4.5958e-02, 1.4457e-01, 9.0422e-02, -1.8504e-02,
6.6698e-02, -9.1017e-02, 2.1632e-02, -2.3687e-02, -1.3944e-01,
5.7714e-02, 2.3237e-02, 5.6633e-03, 3.3012e-02, 1.7218e-01,
4.5760e-02, -4.0419e-02, 7.3764e-03, 1.0328e-01, -8.3812e-02,
5.1095e-03, 4.7326e-02, 1.3975e-01, -2.6735e-02, -6.9531e-02,
2.1803e-01, 3.1915e-02, 9.5131e-02, 7.6577e-02, 1.5405e-01,
4.3387e-02, 5.5865e-02, -3.0148e-01, 4.3004e-03, 6.5930e-02,
5.8391e-02, 6.8175e-02, 1.6588e-01, 1.6431e-02, 1.5859e-01,
1.2959e-01, 4.8708e-02, 3.2759e-02, 7.5881e-02, 4.3172e-02,
6.7582e-02, 6.7494e-02, -9.6171e-03, 1.6678e-01, -1.7962e-02,
1.2000e-01, 4.9883e-02, 7.9275e-02, 4.2137e-02, 5.1357e-02,
1.4634e-01, 1.5458e-01, 9.1801e-02, -4.4290e-03, 6.7601e-02,
7.9916e-02, 4.2656e-02, 6.0210e-02, 1.3525e-01, 2.1626e-02,
2.5922e-02, 5.4564e-03, -2.0931e-02, 1.0214e-02, 3.4089e-03,
-1.8369e-02, 4.0166e-02, 1.0205e-01, 5.4913e-02, -4.6868e-02,
1.9856e-02, 3.8161e-02, 1.7495e-01, 1.3782e-01, 1.9456e-01,
8.8143e-02, 1.1202e-01, 2.9118e-02, 1.5684e-02, 1.2871e-01,
-4.8440e-03, 2.8776e-02, 3.9046e-02, 9.1574e-02, 6.3592e-02,
-9.4818e-02, 1.0447e-01, 2.2536e-02, 1.0008e-01, 1.0121e-02,
-1.0201e-03, 9.7983e-02, -8.0122e-02, 2.3905e-02, 1.2935e-01,
6.9153e-02, 3.3950e-01, 2.0590e-01, 3.5349e-03, 8.6245e-02,
1.1532e-01, -1.6783e-02, 1.2795e-01, 3.9450e-02, -3.9157e-02,
-2.5824e-02, 7.4569e-02, 5.1943e-02, 4.2780e-02, 1.0088e-02,
1.9297e-01, 6.8859e-03, 1.6213e-01, 2.8672e-01, 2.7916e-03,
-4.2865e-02, 1.4515e-01, -1.2121e-01, 6.9745e-02, 7.2056e-02,
-7.3604e-02, 1.2499e-01, -2.2019e-01, -2.5429e-02, -4.3235e-02,
7.4061e-02, 5.9903e-02, -9.1371e-03, 5.5870e-02, 7.8023e-02,
-1.9396e-01, 1.3898e-01, 5.1679e-02, 2.8125e-01, 7.8004e-02,
4.1390e-01, 6.0338e-02, 2.0851e-01, 9.6949e-02, -2.7058e-02,
5.3432e-02, 1.4521e-02, 8.0204e-02, 1.1105e-01, 2.2297e-01,
-2.8141e-01, 3.2592e-02, 8.9051e-02, 1.3268e-01, -9.0669e-02,
1.2267e-01, 9.6171e-03, -7.9379e-02, 1.2216e-01, 3.8192e-02,
-5.2283e-02, 8.0657e-02, -2.0727e-02, -7.4404e-02, -1.0264e-01,
-1.6860e-02, -2.9043e-02, -1.8584e-01, -6.3137e-03, -9.0904e-03,
3.7404e-02, -9.2234e-02, 2.0131e-01, 2.3054e-01, -6.8994e-02,
1.0964e-01, 1.1900e-01, 1.9849e-01, 3.1166e-02, 9.7711e-03,
1.6341e-02, 1.2533e-01, 3.0742e-02, 1.4204e-01, 1.2264e-01,
1.0973e-01, 1.6412e-01, -8.4170e-02, 9.7121e-02, 2.0784e-01,
1.1131e-01, 8.0120e-02, 1.2255e-01, 1.9254e-01, -2.3141e-02,
4.6395e-02, -6.3031e-03, 7.4744e-02, -4.1130e-02, 2.4498e-01,
1.5875e-01, 6.9272e-02, 2.2189e-02, -5.3882e-02, 9.0668e-02,
2.2369e-01, -3.4872e-02, 1.2988e-01, 7.7180e-02, -4.6117e-02,
6.5093e-02, -1.2873e-02, -1.7517e-02, -3.1470e-02, 1.5512e-01,
-1.7699e-02, 5.4292e-02, 1.1466e-01, 1.9494e-02, -1.1747e-02,
1.1432e-01, -1.4345e-01, 1.6700e-01, 6.2729e-02, -1.5905e-02,
7.6338e-02, 1.1849e-01, -1.5245e-02, -2.0953e-02, 3.2459e-02,
-3.5718e-02, 9.4713e-02, 6.1315e-02, -6.9358e-02, 1.9439e-02,
2.9759e-01, -3.9084e-02, 1.7780e-01, 6.6585e-02, 7.6704e-03,
4.1163e-02, 1.6397e-01, 1.5233e-03, 2.3547e-01, 4.2645e-02,
-1.6888e-02, 1.0941e-01, -3.1624e-01, 6.4766e-02, 6.7866e-02,
9.8873e-02, 1.7081e-01, 2.2533e-01, 1.4734e-01, 6.4730e-02,
7.6160e-02, 2.3864e-02, 1.8265e-03, 8.8731e-03, 9.1875e-02,
2.4801e-02, 2.2861e-02, 2.7442e-01, 5.6415e-02, 9.6273e-02,
4.4071e-02, 1.1573e-01, 8.5102e-02, 1.4514e-01, -1.9779e-02,
1.0150e-01, 6.0270e-02, 3.7622e-02, 1.8468e-02, -2.7987e-03,
-2.4019e-02, 4.3530e-02])), ('features.25.weight', tensor([[[[ 0.0054, 0.0146, 0.0159],
[ 0.0020, 0.0050, -0.0082],
[-0.0077, -0.0133, -0.0009]],
[[ 0.0009, 0.0026, -0.0179],
[-0.0028, 0.0089, 0.0009],
[-0.0077, -0.0032, -0.0138]],
[[-0.0203, 0.0038, 0.0233],
[-0.0333, -0.0139, -0.0077],
[-0.0296, -0.0253, -0.0067]],
...,
[[-0.0029, -0.0093, -0.0141],
[-0.0137, -0.0133, -0.0230],
[-0.0239, -0.0241, -0.0202]],
[[-0.0075, -0.0031, -0.0037],
[ 0.0007, 0.0067, -0.0083],
[ 0.0207, 0.0190, 0.0070]],
[[ 0.0050, 0.0191, -0.0082],
[-0.0089, 0.0033, -0.0161],
[-0.0162, -0.0053, -0.0196]]],
[[[-0.0143, -0.0072, 0.0019],
[-0.0154, -0.0157, 0.0037],
[ 0.0060, 0.0080, -0.0030]],
[[ 0.0084, 0.0007, 0.0058],
[-0.0094, -0.0147, 0.0010],
[-0.0163, -0.0218, -0.0196]],
[[-0.0150, -0.0138, -0.0243],
[-0.0090, -0.0232, -0.0245],
[ 0.0201, 0.0118, -0.0017]],
...,
[[ 0.0014, 0.0124, 0.0169],
[-0.0174, -0.0152, -0.0134],
[-0.0586, -0.0394, -0.0209]],
[[-0.0083, -0.0095, 0.0062],
[-0.0218, -0.0035, 0.0027],
[-0.0099, -0.0080, -0.0204]],
[[ 0.0042, -0.0063, -0.0111],
[ 0.0095, -0.0038, -0.0076],
[-0.0058, -0.0011, -0.0124]]],
[[[-0.0008, 0.0266, 0.0602],
[-0.0102, 0.0008, 0.0468],
[-0.0084, 0.0012, 0.0040]],
[[ 0.0007, 0.0140, 0.0047],
[-0.0036, -0.0022, -0.0030],
[-0.0054, -0.0150, -0.0343]],
[[-0.0253, -0.0151, -0.0118],
[-0.0126, -0.0113, -0.0243],
[-0.0111, -0.0050, -0.0125]],
...,
[[-0.0097, 0.0003, 0.0090],
[-0.0080, -0.0091, -0.0187],
[-0.0064, -0.0049, -0.0128]],
[[-0.0123, -0.0226, -0.0308],
[-0.0199, -0.0269, -0.0231],
[-0.0132, -0.0178, -0.0121]],
[[-0.0214, -0.0049, -0.0209],
[-0.0250, 0.0056, -0.0103],
[-0.0102, -0.0091, -0.0155]]],
...,
[[[-0.0006, -0.0171, -0.0199],
[ 0.0011, -0.0085, -0.0133],
[-0.0016, -0.0123, -0.0200]],
[[ 0.0303, 0.0081, -0.0118],
[ 0.0188, -0.0022, -0.0165],
[ 0.0254, -0.0033, -0.0001]],
[[-0.0041, -0.0070, -0.0112],
[ 0.0143, 0.0028, -0.0008],
[ 0.0038, -0.0028, 0.0024]],
...,
[[ 0.0086, 0.0165, 0.0048],
[-0.0209, -0.0316, -0.0363],
[-0.0291, -0.0417, -0.0445]],
[[ 0.0222, -0.0010, 0.0027],
[-0.0190, -0.0347, -0.0347],
[-0.0042, -0.0317, -0.0186]],
[[ 0.0114, -0.0022, 0.0061],
[ 0.0168, 0.0044, 0.0101],
[ 0.0080, 0.0049, 0.0097]]],
[[[-0.0042, -0.0080, -0.0050],
[ 0.0112, 0.0133, -0.0005],
[ 0.0132, 0.0261, 0.0144]],
[[-0.0082, -0.0167, -0.0158],
[-0.0116, -0.0144, -0.0041],
[-0.0036, -0.0136, -0.0129]],
[[ 0.0260, 0.0180, 0.0192],
[ 0.0003, -0.0128, 0.0041],
[-0.0014, -0.0116, 0.0089]],
...,
[[ 0.0384, 0.0261, 0.0270],
[ 0.0118, -0.0035, 0.0213],
[-0.0278, -0.0467, -0.0289]],
[[ 0.0064, -0.0027, -0.0050],
[ 0.0090, -0.0081, -0.0223],
[-0.0023, -0.0236, -0.0283]],
[[ 0.0061, 0.0172, 0.0075],
[ 0.0143, 0.0066, 0.0159],
[ 0.0055, 0.0070, 0.0033]]],
[[[ 0.0028, 0.0089, 0.0167],
[ 0.0039, 0.0086, 0.0119],
[ 0.0040, -0.0093, -0.0078]],
[[ 0.0011, -0.0172, -0.0111],
[ 0.0073, 0.0018, 0.0022],
[ 0.0129, 0.0011, 0.0021]],
[[-0.0057, -0.0117, -0.0223],
[-0.0076, -0.0101, -0.0132],
[-0.0088, -0.0165, -0.0112]],
...,
[[-0.0088, -0.0079, -0.0198],
[-0.0081, -0.0135, -0.0198],
[-0.0174, -0.0198, -0.0073]],
[[-0.0037, -0.0158, -0.0126],
[-0.0023, -0.0100, -0.0121],
[-0.0047, -0.0092, -0.0112]],
[[-0.0029, 0.0096, 0.0150],
[-0.0100, -0.0038, -0.0126],
[-0.0319, -0.0140, -0.0118]]]])), ('features.25.bias', tensor([ 5.4550e-02, 9.0395e-02, 1.3763e-01, 1.5317e-01, 8.7397e-02,
8.9615e-03, 2.7959e-01, 6.4551e-02, 8.5763e-02, -3.5633e-02,
-1.5747e-02, 1.9410e-01, 1.7428e-01, 5.2252e-02, 4.4374e-02,
-6.5025e-03, 9.1678e-02, 2.8751e-03, -3.0198e-02, -1.3081e-02,
2.0216e-02, 9.1585e-04, -4.1244e-02, -4.1938e-02, 4.5144e-02,
2.5334e-01, 1.8160e-01, 2.0175e-01, 4.2058e-02, 4.6474e-02,
-1.0115e-01, -2.6504e-02, -1.6827e-02, -5.6741e-02, -4.9521e-02,
-1.9055e-02, 5.8156e-03, 2.0239e-01, 7.5700e-02, -9.6130e-03,
1.8989e-01, 1.1588e-01, 1.7024e-02, 1.3586e-01, 4.6662e-02,
1.1918e-01, -2.0226e-02, 3.0292e-02, 1.6732e-01, 7.1087e-02,
3.0811e-02, -6.2468e-02, 3.4584e-02, 1.3256e-01, 1.5036e-01,
2.3591e-02, -5.9546e-02, 2.5650e-02, 6.5997e-02, 2.8193e-01,
6.2886e-02, -4.9675e-02, 1.0511e-01, 1.9264e-02, -2.6461e-02,
-5.2579e-03, 9.6973e-03, 2.4467e-01, 2.3271e-02, -5.5900e-02,
1.8553e-01, 1.1481e-01, -2.3156e-02, 1.2930e-01, -5.0827e-02,
7.6912e-03, 2.8213e-01, 1.2676e-01, -3.2407e-02, 3.5874e-02,
-2.0872e-01, 1.0859e-01, -2.3099e-02, 1.1137e-02, 1.2484e-01,
3.1466e-02, 8.2460e-02, 2.8952e-02, 1.3819e-01, 2.1896e-01,
4.8252e-02, 9.4631e-02, 3.3748e-02, -2.0707e-02, 4.0461e-02,
1.4462e-01, 1.3791e-01, 1.4462e-01, 3.3675e-02, 1.6905e-01,
1.1195e-01, -8.6589e-03, 3.0781e-02, 7.1606e-03, 6.9909e-02,
-7.1065e-02, -2.1165e-03, 4.7398e-02, -1.2113e-01, -7.5211e-02,
7.8080e-02, 5.4038e-03, 1.9229e-01, 5.6261e-02, 2.2445e-02,
2.3410e-01, 1.0459e-01, 1.1374e-01, 1.8279e-01, -1.7334e-02,
7.3144e-02, 3.7239e-02, -4.3321e-02, 1.9967e-01, 5.5632e-02,
3.8898e-02, 2.1457e-01, 4.7713e-02, -9.5661e-02, 5.3312e-02,
8.0496e-02, 4.2171e-02, 6.6922e-02, 3.5847e-02, 2.0944e-01,
-2.1141e-02, 1.6862e-01, 2.1875e-01, 9.3951e-02, 1.3485e-01,
1.3305e-01, -1.0092e-01, 9.4168e-03, 7.8153e-02, -7.9164e-02,
6.1590e-02, 6.8566e-02, 9.8991e-02, 1.7955e-01, -8.8558e-02,
-5.9397e-02, -1.6202e-02, -7.2225e-02, 6.5177e-02, -1.9023e-01,
9.4891e-02, 1.5589e-01, 1.4033e-02, 5.6364e-02, 7.5691e-02,
1.8211e-01, 1.7732e-01, 1.8649e-01, 1.3231e-01, 1.2683e-01,
7.2747e-02, 1.0056e-01, -9.3260e-02, -9.5954e-02, 7.0644e-02,
-5.9384e-02, 4.7805e-02, 6.5701e-02, 9.4919e-04, 8.0736e-02,
3.8317e-02, 7.5393e-02, 5.5428e-02, 9.4576e-02, 3.2688e-02,
1.6831e-02, 7.6348e-04, 3.9503e-02, 6.8626e-02, 6.6301e-02,
-3.5059e-04, 5.6684e-02, 2.3321e-02, 1.1641e-01, 4.9491e-02,
9.1192e-02, 5.4002e-02, 2.8751e-03, 9.6218e-02, 1.8073e-01,
1.1155e-02, 1.1938e-01, 3.8680e-02, 1.8039e-01, 1.1379e-01,
1.3083e-01, 6.0210e-02, 6.0117e-03, 4.2065e-02, 1.8790e-02,
2.3845e-02, 4.1124e-02, -4.7766e-02, 9.2979e-02, -1.3242e-02,
2.3420e-02, 6.7011e-02, 5.7222e-02, 1.6712e-01, 9.4531e-03,
-1.0062e-01, 3.1609e-02, 1.1899e-01, 6.2357e-02, 7.6572e-02,
1.6689e-02, -8.9327e-03, -6.6003e-02, 4.8060e-02, 3.1899e-02,
2.1463e-01, 6.5565e-02, 6.4432e-02, 9.3042e-02, -3.6661e-02,
-1.3594e-01, 4.5892e-02, 1.2372e-01, 6.7679e-02, 1.9842e-01,
-8.1975e-02, 5.2893e-02, -1.2591e-03, -1.3093e-02, 6.5494e-02,
4.6469e-02, 1.5098e-01, -1.8719e-01, 1.2957e-01, -3.7854e-02,
6.1765e-02, 1.0871e-01, 1.8761e-01, -6.6187e-02, -3.5769e-03,
8.8650e-02, 4.1342e-02, -1.3272e-02, 9.8334e-02, -7.8020e-02,
6.2373e-03, 1.1495e-01, 4.8916e-02, -7.8232e-02, -3.4400e-02,
-1.7616e-01, 5.2329e-02, 8.1552e-03, 9.3582e-02, 8.4386e-02,
3.0095e-02, 9.5031e-02, 1.9144e-01, 1.1496e-01, 2.0283e-02,
-1.1171e-01, -5.2628e-02, -1.5148e-01, 3.3872e-03, 8.6827e-03,
6.4258e-02, -3.2221e-01, 1.1242e-01, 8.9261e-02, 1.3775e-01,
3.8761e-02, -1.7208e-02, -6.9449e-03, 9.2393e-02, 1.3051e-01,
-3.7832e-02, 9.8489e-02, 9.9271e-02, -6.4884e-02, 6.0832e-02,
-2.1743e-02, -7.5061e-02, 4.1395e-02, 1.0777e-01, 9.0592e-02,
7.0215e-02, -1.6673e-02, -5.3364e-03, 1.6847e-02, 3.4441e-02,
2.6234e-02, 1.1366e-02, 1.3685e-01, 5.7468e-02, -6.5293e-02,
-1.9745e-01, 9.0019e-02, 4.7977e-02, -5.3513e-02, 2.5838e-01,
-8.4486e-02, -1.1486e-01, -3.9687e-02, 2.0573e-01, 2.2821e-02,
-4.6728e-03, 9.2958e-02, 8.9081e-02, 9.2180e-02, 3.2073e-02,
1.0805e-01, -5.4812e-02, 2.7815e-02, 1.5807e-02, -1.1218e-01,
1.1385e-01, 2.3944e-01, 3.6883e-02, 1.8457e-01, 8.3830e-02,
-2.7083e-02, 8.8215e-02, 4.3873e-02, 2.4400e-03, 5.5964e-02,
5.2942e-03, 9.9795e-02, -2.7955e-02, 1.5199e-02, 1.0434e-01,
1.1015e-01, 2.0031e-01, 5.2810e-02, 3.0435e-01, 4.7261e-02,
-2.2388e-02, 6.2388e-03, -1.2972e-02, 6.0259e-02, 6.2335e-02,
4.0839e-02, 2.7721e-01, 3.1893e-02, 5.4768e-02, 1.0697e-01,
-6.3277e-02, 8.3997e-02, 4.1833e-02, -5.5414e-02, 7.7578e-02,
-5.8501e-02, 1.3513e-02, 1.7627e-01, 4.7262e-02, 1.0093e-01,
8.7916e-02, 3.7135e-02, -3.6307e-02, 5.7830e-02, 1.5483e-01,
-2.3949e-02, 8.3196e-02, 1.1404e-01, 2.0931e-02, 1.3461e-01,
-6.8469e-02, 7.8940e-02, 2.5084e-02, 2.1714e-01, -5.0419e-02,
1.1919e-01, 8.5465e-02, 1.9923e-04, 6.5783e-02, 3.7303e-03,
4.3919e-02, -2.5169e-02, -1.4702e-02, -1.0312e-01, 6.6326e-03,
8.6184e-02, -1.6896e-03, 3.6182e-02, 2.6042e-02, -6.6342e-02,
-4.2176e-03, 1.6597e-01, -1.2914e-01, 1.1083e-01, 6.3667e-02,
-3.1632e-02, 1.9584e-01, -1.8214e-02, -2.9678e-02, 6.5210e-02,
2.5564e-01, 2.2309e-01, 9.1593e-02, 1.3563e-01, 1.3853e-02,
-1.3630e-02, 3.7037e-02, 3.7722e-02, -6.3225e-02, 1.6186e-01,
7.1135e-02, -2.0982e-01, -3.4710e-02, -3.3498e-02, 4.9611e-02,
-5.3892e-02, -5.0108e-02, -2.1911e-02, 2.8282e-01, 1.7638e-01,
1.3328e-01, 4.3874e-02, 1.1706e-01, 1.5481e-01, 5.7874e-02,
-7.9418e-03, 1.0832e-01, -2.2828e-02, -2.9257e-02, 5.5756e-02,
8.0955e-02, -2.7904e-02, 6.1885e-03, -5.1994e-02, 8.8473e-02,
1.0152e-01, -8.3706e-02, 5.1012e-02, 1.1241e-01, 2.4864e-03,
-6.3696e-02, 1.1552e-01, 9.9662e-02, 8.5439e-02, -3.3692e-03,
5.4528e-02, 1.2733e-01, -8.2801e-03, 2.4022e-01, 3.2309e-01,
-1.4701e-02, 7.4046e-02, 2.3383e-02, 2.4219e-02, 7.1903e-02,
-3.6535e-02, -4.2770e-02, -5.4112e-03, -1.0857e-01, 1.7546e-01,
-1.1867e-01, 1.4469e-01, 1.5331e-01, 9.0448e-02, 6.6701e-02,
2.7743e-02, 2.3978e-02, 1.4492e-01, 2.2927e-03, -1.6870e-02,
2.0847e-01, 6.6245e-02, -4.1411e-02, 1.1131e-01, -7.3963e-02,
1.0207e-01, -1.2472e-01, -3.8386e-03, 3.8750e-02, -2.5393e-02,
1.4188e-01, 1.9498e-02, 1.2334e-01, 1.6884e-01, 8.8518e-02,
-6.9111e-03, 1.0803e-01, -7.8050e-02, 1.2133e-01, 5.1863e-02,
-1.4520e-04, 5.5940e-02, 1.6511e-01, 7.0721e-02, 1.0072e-02,
2.5335e-02, 3.5958e-02, 9.8417e-02, -5.2074e-02, -3.5101e-02,
2.1566e-01, 8.0015e-02, 1.4218e-01, 2.4394e-01, -1.3442e-02,
9.7152e-02, 2.0463e-03])), ('features.28.weight', tensor([[[[ 1.0534e-02, 7.4071e-03, 1.7693e-02],
[ 9.6142e-03, 8.1172e-03, 1.6695e-02],
[ 9.4952e-03, 5.4519e-03, 2.4258e-03]],
[[-3.2529e-02, -4.1647e-02, -4.6546e-02],
[ 2.0441e-03, 4.1343e-03, -2.9100e-02],
[-9.5072e-03, 9.0524e-03, -2.0988e-02]],
[[-9.9875e-03, 8.5819e-03, 1.5936e-02],
[ 1.1354e-02, 4.4704e-02, 4.0895e-02],
[ 3.9529e-03, -2.2457e-02, 9.6313e-03]],
...,
[[ 1.1221e-03, 1.7718e-02, -2.0417e-03],
[-1.0568e-03, -1.4011e-02, -6.8687e-03],
[ 9.3055e-03, 1.2550e-02, 8.0547e-03]],
[[-8.3888e-03, -4.3527e-03, -1.6764e-02],
[-2.7760e-02, -4.0867e-02, -3.7720e-02],
[-8.7290e-03, -1.5348e-02, -1.5408e-02]],
[[ 2.3846e-02, 8.0078e-03, -1.4325e-02],
[ 1.3081e-02, 6.5976e-03, -1.6950e-03],
[-2.3387e-03, 6.2091e-03, 1.0908e-03]]],
[[[-4.5460e-02, -2.7670e-02, -2.9360e-02],
[-2.9663e-02, -1.9418e-02, -2.0720e-02],
[-4.1865e-02, -3.7580e-02, -3.7262e-02]],
[[-2.7951e-02, -7.4118e-04, 5.3942e-03],
[-2.1042e-02, -9.5724e-03, -9.3760e-03],
[-2.3509e-02, 2.6612e-03, -1.5148e-02]],
[[ 2.0806e-02, 1.2486e-02, 1.4442e-02],
[ 2.3443e-02, 1.6937e-02, 2.0494e-02],
[ 4.2165e-02, 3.3524e-02, 1.9346e-02]],
...,
[[ 2.9569e-02, 2.4817e-02, 3.1889e-02],
[ 4.7988e-02, 2.7031e-02, 3.4781e-02],
[ 4.6770e-02, 4.3886e-02, 4.0007e-02]],
[[ 9.1720e-04, 8.3590e-03, 1.2437e-02],
[ 5.9093e-03, 1.8841e-02, 6.2183e-03],
[-2.0215e-03, 9.7654e-04, -4.4745e-03]],
[[ 5.1511e-03, 6.5026e-03, 1.5780e-02],
[-6.5917e-03, -1.2477e-02, 1.0469e-03],
[ 3.7334e-03, 1.7589e-04, 1.5428e-02]]],
[[[ 1.9287e-02, 1.3897e-02, 1.8621e-02],
[-6.2282e-03, -1.4324e-02, -1.2734e-02],
[-2.2646e-03, -1.5617e-02, 4.8161e-03]],
[[ 1.1769e-02, 1.3531e-02, -1.0082e-02],
[-2.3015e-02, -2.2524e-02, -2.3136e-02],
[-2.3968e-02, -2.3774e-02, -1.5165e-02]],
[[ 3.7307e-03, -3.2668e-03, 9.5886e-04],
[ 1.0051e-04, -2.9065e-03, 3.0753e-03],
[-9.5781e-04, 3.6041e-03, -6.5328e-03]],
...,
[[-1.6190e-02, -1.3174e-02, -3.2479e-03],
[-8.8058e-03, -9.6491e-03, -6.0777e-03],
[-6.1347e-03, -9.1103e-03, -1.0093e-02]],
[[-5.6735e-03, -8.7211e-03, -1.3515e-02],
[-8.8690e-03, -9.1880e-03, -1.5566e-02],
[-2.0200e-02, -2.4437e-02, -1.8967e-02]],
[[ 1.0763e-02, 2.1825e-02, 1.9055e-02],
[ 2.1035e-02, 1.7930e-02, 1.2347e-02],
[-4.6210e-03, -3.7991e-03, -8.0422e-03]]],
...,
[[[-9.6637e-03, 1.3108e-04, -1.7421e-02],
[-5.2615e-03, -9.2083e-03, -1.4362e-02],
[-1.6242e-02, -1.5424e-02, -2.1202e-02]],
[[ 1.5105e-02, 9.4753e-03, 1.7052e-02],
[ 1.2088e-02, 4.0698e-03, 1.8819e-02],
[ 2.6073e-02, 1.9123e-02, 2.5986e-02]],
[[ 1.8388e-02, -1.8862e-03, 4.0384e-03],
[ 1.0798e-02, 4.4443e-03, 1.3677e-02],
[ 6.4806e-02, 5.4876e-02, 3.7040e-02]],
...,
[[-1.7374e-02, -2.4695e-03, -2.3151e-02],
[-1.1937e-03, 1.8140e-02, 1.1035e-02],
[-6.8690e-03, 1.4612e-02, -8.4875e-03]],
[[-1.3237e-02, -1.4303e-02, -1.7158e-02],
[-7.1642e-03, -6.3557e-03, -2.5595e-02],
[-6.9908e-03, -7.5897e-03, -2.3305e-02]],
[[-1.7111e-02, -1.9325e-02, -2.7272e-02],
[-2.7464e-02, -1.7646e-02, -2.5215e-02],
[-2.3997e-02, -1.4346e-02, -2.7065e-02]]],
[[[ 1.6478e-02, 1.9886e-02, -1.4327e-03],
[ 3.6819e-02, 2.7425e-02, -1.3119e-02],
[ 3.2025e-02, 1.2257e-02, 2.9387e-03]],
[[-1.0890e-02, 1.7128e-02, 4.2529e-04],
[-1.4745e-02, 1.3635e-02, 1.0135e-02],
[-6.3535e-03, -6.8742e-03, 1.0029e-02]],
[[ 4.3664e-03, 2.3589e-02, 6.2146e-05],
[ 9.9697e-03, 1.1119e-02, 2.5601e-02],
[-7.7296e-04, -2.0062e-03, 1.6649e-02]],
...,
[[-4.3500e-03, -1.3366e-02, -1.1923e-02],
[-2.9168e-03, -2.3801e-02, -1.1260e-02],
[-1.1050e-03, -2.4048e-02, -7.3301e-03]],
[[-6.9538e-03, -8.2345e-03, -1.2369e-02],
[-1.4201e-02, 3.5598e-03, -8.6927e-04],
[-1.2974e-02, -1.4085e-04, -1.2429e-02]],
[[ 1.9372e-02, 1.4293e-02, 3.4951e-03],
[ 5.7221e-03, 6.3302e-03, -1.9670e-03],
[ 9.1195e-03, 1.1653e-02, 6.4557e-03]]],
[[[ 1.3058e-02, 2.9712e-02, 4.4140e-02],
[ 1.4423e-02, 1.3731e-02, 2.9398e-02],
[-1.9841e-02, -1.9170e-02, -2.6967e-02]],
[[-2.1299e-02, 1.7899e-02, 2.0077e-02],
[ 3.9332e-03, 1.8979e-02, 1.2482e-02],
[-1.4817e-03, -1.5337e-03, -5.3633e-03]],
[[-6.5409e-03, -1.7854e-03, -1.6178e-02],
[-1.0027e-02, 1.4759e-03, 8.2547e-03],
[-9.2373e-03, 1.2668e-03, -5.2832e-03]],
...,
[[ 2.4018e-02, 1.8149e-02, 1.7612e-02],
[ 4.7869e-03, -1.8983e-02, -8.6058e-03],
[-1.9520e-02, -2.5497e-02, -9.7797e-03]],
[[-2.1928e-02, -2.1271e-02, -2.0940e-02],
[-1.2071e-02, 9.7580e-04, 7.5258e-03],
[ 6.4582e-03, 2.5957e-02, 2.1490e-02]],
[[ 5.6633e-03, -1.3606e-03, 9.2883e-03],
[-1.2347e-02, -1.6123e-02, -1.1777e-02],
[-1.4318e-02, -2.8732e-02, -1.4232e-02]]]])), ('features.28.bias', tensor([-5.2319e-02, 1.0119e-01, 8.1453e-02, 1.5000e-02, 6.2639e-02,
5.0773e-02, -2.5211e-02, 1.5447e-01, 6.8224e-02, 7.2643e-02,
-3.2789e-02, -8.0750e-02, 6.1024e-02, -5.5529e-02, -2.4767e-02,
-9.8564e-02, -2.5021e-02, -9.6584e-02, -3.3062e-02, -9.7797e-02,
6.6945e-02, -9.5664e-02, -4.4807e-02, 1.6170e-02, 8.8866e-02,
3.9108e-02, 1.2072e-02, 1.0541e-01, 6.9712e-02, -8.9803e-02,
3.6094e-02, 8.0960e-02, 8.5670e-02, -5.1694e-03, -9.0972e-02,
-2.5987e-02, -7.9560e-02, -1.4550e-02, -8.5629e-02, -4.4735e-02,
-5.7599e-02, 4.6076e-04, 1.6221e-01, 1.3436e-01, 1.2053e-02,
1.7653e-02, 2.8465e-02, 1.4352e-01, -2.9918e-02, -1.0108e-01,
2.8936e-02, -4.1771e-02, -2.6316e-02, 7.9220e-02, 6.0954e-02,
1.1343e-01, 1.6546e-02, 5.0513e-02, 3.4709e-02, 1.8296e-02,
3.8976e-02, -5.4446e-03, 3.4109e-02, 9.6858e-03, -2.6728e-02,
3.4159e-02, 1.8631e-02, -5.9770e-02, 2.9009e-02, -2.3023e-02,
-7.9853e-02, -7.7489e-02, 9.2084e-02, -8.2976e-02, 1.3697e-01,
-4.0538e-03, -4.4704e-02, 1.0316e-02, 2.0040e-01, -3.5939e-02,
-8.7572e-02, -3.1023e-02, -6.1228e-02, -1.4180e-02, -4.9270e-02,
-4.9876e-01, 1.7821e-01, 9.2966e-02, -2.3813e-02, -8.2073e-02,
-5.5366e-02, -2.6926e-02, 3.8809e-02, 3.8208e-01, 1.0139e-01,
-6.0166e-02, 1.4986e-02, -1.6999e-01, -1.1088e-01, -1.8091e-03,
-4.0059e-02, -3.1694e-02, 1.8829e-01, 3.6370e-02, -1.1815e-01,
1.6316e-01, 7.9048e-02, -1.9583e-01, 1.0821e-01, 8.3884e-02,
1.2717e-01, -1.0928e-01, 1.4351e-01, 1.7559e-02, 4.4746e-02,
-1.2567e-02, -2.1301e-02, 1.6970e-01, -2.2387e-02, -1.6801e-02,
1.3783e-01, 2.9619e-02, 1.1342e-01, -1.1338e-01, 7.4107e-03,
1.0990e-02, -1.4393e-02, 1.3743e-01, 2.5126e-02, 7.8620e-02,
3.9795e-02, 5.5431e-02, 5.8049e-02, -3.8614e-02, 1.2411e-01,
-4.4721e-02, 6.2589e-02, 4.1232e-02, 7.2544e-03, -4.9999e-02,
1.2566e-02, 2.0084e-02, 5.8730e-02, -1.0603e-01, -1.3959e-01,
1.5580e-01, 5.2214e-02, 3.0000e-01, 2.8463e-02, 4.3451e-02,
2.1800e-01, 1.3818e-01, 3.8503e-02, 7.2682e-03, 7.4285e-02,
8.0416e-02, -2.0947e-02, 5.4322e-02, -3.2007e-02, 6.7700e-03,
4.5463e-02, 3.8825e-02, -2.3935e-02, 1.4184e-01, 8.7481e-02,
1.6047e-01, -8.6290e-02, 1.4325e-01, 2.7610e-02, -3.8653e-02,
-1.6029e-02, 1.4460e-01, -1.4957e-02, 1.2465e-01, 8.1046e-02,
2.1481e-01, -4.7877e-02, 6.1483e-02, 5.8536e-02, 6.2318e-02,
-3.6106e-02, -3.6376e-02, 1.2400e-01, 1.5077e-01, -4.8957e-02,
4.6412e-02, 8.5859e-02, -2.4308e-02, 3.0885e-02, 5.8336e-02,
-8.9022e-02, 1.9121e-01, 1.3998e-01, 3.0894e-02, -3.0751e-02,
1.5327e-01, -6.7166e-02, -3.4163e-02, -2.3023e-02, -2.0684e-01,
8.4709e-03, -7.6189e-02, -5.5113e-02, -2.0514e-03, 1.1287e-01,
-1.7615e-01, -5.4491e-02, -3.2664e-02, -7.6296e-02, 1.7682e-02,
5.8436e-02, 6.1951e-02, 8.3803e-02, 3.1652e-02, 2.7920e-02,
-7.4084e-02, 5.6317e-02, -9.8802e-02, 8.0819e-02, 2.2038e-02,
9.1505e-03, -1.5406e-01, 7.9764e-03, -2.0910e-01, 1.6293e-02,
2.7683e-01, 7.0229e-02, 3.3388e-02, -3.9456e-02, -1.6532e-01,
1.5774e-02, -6.5307e-02, 8.4644e-02, 1.5086e-01, 2.6687e-02,
7.0895e-02, 6.9512e-02, 2.0569e-01, 6.1195e-02, 5.1097e-02,
-1.0382e-01, 1.1823e-01, 3.7355e-02, -8.1613e-02, -1.1839e-01,
-1.1147e-01, 2.5252e-02, -2.8725e-02, -7.3916e-02, 1.7876e-01,
2.6345e-02, 5.9095e-02, 3.2911e-01, -2.2924e-01, -8.3702e-02,
5.7431e-02, -5.8956e-02, 2.6202e-02, 4.2381e-02, -8.0436e-02,
1.0106e-01, -6.9901e-03, 6.6512e-02, 4.0822e-02, 4.3963e-02,
3.9228e-02, 4.9757e-03, 1.2105e-02, 6.0183e-02, -1.3172e-02,
2.3132e-01, -7.9290e-02, 1.2083e-01, 1.7964e-02, 9.2115e-03,
-2.3823e-01, 2.5518e-02, -2.7091e-02, 1.0960e-02, -6.5852e-02,
1.1255e-01, 5.6381e-02, 2.1800e-01, 7.6682e-02, 1.2805e-01,
7.2715e-02, 1.1211e-01, 9.9145e-02, -1.3543e-01, -4.7893e-02,
-2.0325e-01, 1.9199e-01, -6.0506e-04, -4.2153e-02, 6.4415e-03,
6.3642e-02, -2.2402e-02, 3.6096e-02, -1.1031e-02, -1.6002e-02,
-6.2441e-02, -5.7279e-02, 2.1078e-02, -7.9140e-02, 1.1161e-02,
2.1650e-01, 1.2846e-02, -1.7665e-02, -6.6584e-02, 6.8555e-02,
1.5757e-01, 1.2694e-02, -1.1466e-01, 6.6131e-02, -5.4128e-02,
-9.5659e-02, 1.4357e-02, -3.6948e-03, -1.0830e-01, 8.7277e-02,
-7.7305e-03, 1.4493e-01, -2.1617e-02, 3.1007e-02, -3.6226e-02,
1.8052e-02, -3.7769e-02, 1.0537e-01, 2.0080e-01, 1.0989e-01,
1.4839e-01, -1.1903e-01, 3.9451e-02, -1.0955e-01, -1.8305e-02,
5.6299e-02, 7.2543e-02, 8.4539e-02, -9.0520e-02, 5.7590e-02,
6.9237e-02, 2.8184e-02, 1.4489e-01, 1.4316e-01, 1.4845e-02,
-2.3454e-02, 1.8365e-02, -6.7493e-02, -7.4628e-03, -5.1381e-02,
-3.1398e-02, -2.7128e-02, -9.5707e-02, 7.7486e-02, 1.0734e-01,
-9.8759e-02, 1.7754e-03, 3.4874e-02, -7.3016e-02, 6.3602e-02,
-1.1928e-01, 8.9537e-02, -4.7117e-02, -5.0695e-02, 2.7185e-02,
1.0082e-01, 3.0307e-02, 3.0656e-02, -4.1152e-02, 7.0813e-02,
1.2294e-02, -6.2558e-02, 3.5055e-01, 7.7277e-02, -7.9570e-03,
2.2271e-02, -3.5841e-02, -1.1285e-01, 2.2260e-02, 6.0310e-02,
-1.3940e-01, -5.1366e-03, -9.0632e-02, -2.6710e-01, -6.3740e-02,
-3.0285e-04, -2.6488e-02, -4.9796e-03, -1.3462e-01, 1.7971e-01,
-9.8749e-02, -6.8532e-02, -4.1654e-02, 5.7410e-02, -1.8544e-02,
-3.7313e-02, -1.5468e-02, -2.2628e-02, -4.0162e-02, -5.3857e-02,
1.3850e-01, 1.1608e-01, 1.7646e-02, -7.2828e-04, 3.0746e-01,
-1.2535e-01, 4.0908e-02, 1.8459e-02, 5.0482e-02, -7.4063e-02,
1.1109e-01, 5.5451e-04, 1.1971e-01, -1.5124e-01, 2.6552e-01,
7.6453e-02, -9.4802e-02, -1.0916e-01, -1.4684e-01, 2.2458e-01,
2.3925e-01, 5.7071e-02, 4.3034e-02, 5.1031e-02, 2.5769e-02,
-3.4594e-02, 3.9091e-02, 4.3689e-02, 1.0330e-01, -6.5218e-02,
2.1453e-03, 9.1074e-02, -6.0471e-02, -4.5308e-02, 2.8054e-02,
-4.1451e-02, -7.5289e-02, 1.7288e-02, 1.0439e-01, 9.7880e-02,
8.3670e-02, -2.9836e-02, -8.0140e-02, -1.6872e-02, -3.0255e-02,
-1.9757e-02, -1.0094e-01, -1.0232e-01, -1.1449e-01, 2.5125e-02,
1.6253e-01, 3.1867e-02, 1.1970e-01, 7.2658e-02, 2.1138e-01,
1.1993e-01, -5.6040e-02, -2.8450e-02, 9.8043e-03, 4.4326e-02,
-5.8429e-02, 1.7908e-02, 1.3068e-01, -5.1915e-02, 5.5971e-02,
-2.3966e-02, 6.3832e-02, 2.2890e-02, 1.1517e-03, 4.2246e-02,
1.5556e-01, 2.7320e-02, -4.0930e-02, -1.5577e-01, -1.1518e-01,
2.6389e-02, -1.2672e-01, 6.6052e-02, -3.8055e-02, 7.2314e-02,
-7.6355e-02, -5.4386e-02, 6.3691e-02, -2.6207e-02, -1.7572e-01,
7.7995e-02, -4.3244e-02, 6.0467e-02, 6.3842e-02, -2.9012e-02,
8.9383e-02, -1.1529e-01, 1.7078e-01, 2.2612e-02, -7.9203e-03,
1.6124e-02, 8.9942e-02, -1.5939e-02, 2.0508e-01, 3.6991e-02,
-1.0916e-01, 1.1201e-01, 5.5365e-03, -1.0630e-01, 1.1302e-01,
-2.4656e-02, -9.0783e-02, 2.6274e-02, 2.3396e-01, 1.3305e-01,
2.7204e-04, -4.3843e-02])), ('features.30.weight', tensor([[[[-1.0497e-02, -1.5691e-02, -1.0513e-02],
[ 7.3480e-03, 1.5552e-02, 2.9257e-03],
[ 1.5002e-02, 3.0373e-02, 2.9092e-02]],
[[-7.0205e-03, -1.5671e-02, -1.1565e-02],
[-1.4363e-02, -1.3795e-02, -2.3963e-02],
[-2.3295e-02, -1.4808e-02, -2.0007e-02]],
[[ 2.4108e-02, 1.9836e-02, 2.2440e-02],
[ 7.9024e-03, 8.5782e-03, 4.2354e-03],
[-1.2663e-03, -1.7474e-03, -1.0438e-03]],
...,
[[-3.6594e-04, 2.3954e-03, 5.2833e-05],
[-1.2250e-02, -6.3877e-03, 2.8226e-03],
[-1.6565e-02, -7.0453e-03, -1.1510e-02]],
[[ 1.9524e-02, 1.1377e-02, 2.7544e-02],
[ 2.1339e-02, 7.4234e-03, 1.0186e-02],
[ 2.3797e-02, -9.5886e-03, -1.4650e-03]],
[[ 1.2593e-02, 1.3650e-02, 7.9043e-03],
[ 2.5032e-03, 1.7411e-02, 5.2096e-03],
[ 2.0734e-03, 2.3740e-02, 1.1073e-02]]],
[[[-1.8390e-03, -1.0865e-02, -7.5843e-03],
[-1.4256e-02, -7.1076e-03, -9.3254e-04],
[-1.7686e-02, -1.4516e-02, -1.1569e-02]],
[[ 9.4251e-04, 1.6333e-02, -4.3094e-03],
[ 3.6085e-03, 1.1873e-02, 7.4424e-03],
[ 5.7383e-03, -1.8583e-03, 9.4521e-03]],
[[-2.9334e-03, -1.2958e-02, -1.8064e-02],
[-4.8047e-03, -1.8562e-02, -3.1525e-02],
[-7.9825e-03, -1.3393e-02, -1.9706e-02]],
...,
[[-2.7356e-03, -1.4201e-02, -1.2284e-03],
[-8.6081e-03, -1.1953e-02, -5.5344e-03],
[-4.7740e-03, -1.8944e-02, -1.7099e-02]],
[[-1.3712e-02, -7.0349e-03, 1.9559e-02],
[-3.7068e-03, 5.3912e-03, 2.4351e-02],
[ 6.8240e-03, 1.5813e-03, 1.0841e-02]],
[[-6.0921e-03, -7.1504e-03, -8.1523e-03],
[-3.8444e-03, 8.7417e-04, 5.7223e-03],
[-1.3369e-02, -3.3398e-03, 2.0138e-03]]],
[[[-1.4791e-02, -9.3280e-03, -8.0480e-03],
[-1.5242e-03, -6.9774e-03, 2.1907e-03],
[ 9.0740e-03, 1.0212e-02, 2.4935e-02]],
[[-2.5386e-02, -1.6944e-02, -1.8967e-02],
[-2.2017e-02, -1.2266e-02, -6.7644e-03],
[-1.6928e-02, -1.0789e-02, -7.1197e-03]],
[[ 8.5892e-03, 2.6223e-02, 3.0058e-02],
[-9.0205e-03, 1.5526e-02, 2.8158e-02],
[-2.4482e-02, -6.5889e-03, 2.0997e-03]],
...,
[[-1.4853e-02, -1.5064e-02, -1.3973e-02],
[ 8.4511e-04, 5.5825e-03, -1.5933e-04],
[ 3.8109e-03, 1.8033e-02, 2.2754e-03]],
[[-1.9244e-03, 9.4264e-03, 3.2201e-02],
[ 2.1682e-02, 1.8095e-02, 3.4150e-02],
[ 3.1240e-02, 2.3811e-02, 4.3272e-02]],
[[-2.2254e-02, -1.1660e-02, -1.8439e-02],
[-2.3109e-02, -2.2658e-02, -1.4568e-02],
[-1.1983e-02, 1.8506e-03, -5.4576e-03]]],
...,
[[[-3.2572e-02, -3.0159e-02, -1.3425e-02],
[-2.3446e-03, -1.2989e-02, -3.9669e-03],
[ 2.7404e-02, 1.1494e-02, 2.3702e-02]],
[[ 2.9039e-02, 3.0188e-02, 2.3692e-02],
[-6.3915e-03, -3.9415e-03, -3.2791e-03],
[-2.8783e-02, -3.3632e-02, -2.3336e-02]],
[[-6.4091e-04, -1.0622e-02, 2.1667e-04],
[-6.5870e-03, -1.8631e-02, -1.6622e-03],
[ 1.7545e-02, 2.1271e-02, 1.7073e-02]],
...,
[[-1.3385e-02, -5.2247e-03, -4.9591e-03],
[-9.9097e-03, -6.1926e-03, -7.9978e-03],
[-7.4954e-03, -9.6863e-03, -6.3085e-03]],
[[-2.7701e-03, -2.2570e-03, 4.4392e-03],
[ 4.8472e-03, -3.3344e-03, 3.5634e-03],
[-3.2524e-03, 6.1525e-03, 6.9243e-03]],
[[-2.8188e-02, -2.5067e-02, -2.8885e-02],
[ 6.1223e-03, 2.3891e-02, -3.0332e-03],
[ 3.8110e-02, 5.4424e-02, 1.2177e-02]]],
[[[-2.7759e-04, -5.2200e-03, -2.7073e-02],
[ 9.5919e-03, -9.8219e-03, -1.7243e-02],
[-7.2445e-03, -1.9470e-02, -2.0567e-02]],
[[ 1.2983e-02, -2.1955e-03, 9.8715e-03],
[ 8.4865e-03, -3.7189e-03, 1.8862e-02],
[ 1.0649e-02, 7.8248e-03, 2.0532e-02]],
[[ 1.2739e-03, 3.6503e-05, 9.6378e-03],
[-5.2499e-03, -1.4339e-02, -6.9089e-03],
[-9.8457e-03, -1.3833e-02, -4.7849e-03]],
...,
[[-4.3422e-02, -4.7683e-02, -3.8055e-02],
[-2.1421e-02, -2.1383e-02, -9.3468e-03],
[-9.8582e-03, 1.7197e-03, 1.1510e-02]],
[[-4.1923e-02, -4.0893e-02, -5.1180e-02],
[-9.2622e-03, -6.0983e-04, -7.6488e-03],
[ 1.0245e-02, 8.0677e-03, 7.2420e-03]],
[[-3.0808e-02, -3.2002e-02, -1.2284e-02],
[-1.1001e-02, -2.0435e-02, -1.2846e-03],
[ 1.7171e-02, 7.5113e-03, 1.4121e-02]]],
[[[-9.0677e-03, -2.0509e-03, -6.2102e-04],
[-2.2303e-02, -2.0009e-02, -1.8061e-02],
[-2.9415e-02, -3.7534e-02, -2.7441e-02]],
[[-1.9086e-03, -5.1035e-03, 3.2521e-03],
[-6.5440e-03, -1.5106e-03, -1.0450e-02],
[-8.5359e-03, -2.0444e-02, -2.0804e-02]],
[[ 3.5132e-02, 3.3223e-02, 2.6994e-02],
[ 1.6727e-02, 8.6008e-03, 1.9395e-04],
[ 6.3749e-03, 3.9616e-04, -4.6348e-03]],
...,
[[ 2.0223e-04, 3.7146e-04, -1.2160e-02],
[ 1.9127e-02, 1.5279e-02, -2.2109e-03],
[ 5.6650e-03, 1.2125e-03, -1.1725e-02]],
[[-1.6715e-02, -1.9481e-02, -5.7449e-03],
[-5.1775e-03, -1.1694e-02, -3.3920e-03],
[ 1.3342e-02, 1.4896e-02, 2.7673e-03]],
[[-2.5258e-02, -2.7444e-02, -1.9807e-02],
[-4.0062e-02, -2.8320e-02, -2.2498e-02],
[ 1.1992e-02, 3.8998e-02, 2.9096e-02]]]])), ('features.30.bias', tensor([ 2.4808e-02, -3.2451e-02, 9.0068e-03, 2.3684e-01, 3.9230e-02,
1.1773e-01, 1.9459e-02, 2.0499e-01, 1.5169e-01, 8.1941e-02,
9.1385e-02, 1.4322e-01, 2.3864e-01, 6.9181e-02, 1.1039e-02,
-1.1903e-02, -7.9656e-02, -8.1591e-02, 1.1165e-01, 3.8083e-01,
2.4376e-01, -7.8311e-02, -3.0780e-02, -1.2566e-01, 6.1238e-02,
-2.9707e-03, 2.0138e-01, 1.8291e-01, 1.6381e-01, 1.9854e-01,
6.7884e-02, 1.3849e-01, 4.8964e-02, 9.1923e-03, 1.3405e-01,
9.7764e-02, 9.6140e-02, 1.8617e-01, -1.7902e-01, -1.6217e-01,
1.2027e-01, -3.5277e-03, 2.0704e-01, 1.1222e-01, 2.6869e-02,
1.0243e-01, 9.1989e-03, 7.9558e-02, -5.6122e-02, 5.9452e-02,
-1.9038e-01, -1.8812e-01, -1.5122e-02, -8.4815e-02, 6.6375e-02,
2.2357e-01, -4.2816e-02, 8.3678e-02, 6.8005e-02, -3.2785e-02,
8.4201e-02, 7.9179e-02, 6.7980e-02, 1.6367e-01, 4.6821e-02,
1.6028e-02, 1.2493e-01, -1.3118e-01, 2.9104e-02, 1.9252e-01,
-1.4585e-01, 1.0813e-01, 7.3010e-02, 1.2094e-01, 2.9136e-02,
-1.2462e-01, -8.7038e-02, -9.2302e-02, -2.9830e-02, -1.5125e-02,
-5.9967e-02, 6.5575e-02, 6.6929e-02, -3.1152e-02, 2.6122e-01,
1.1855e-01, -6.6356e-02, -1.1969e-01, 7.6353e-02, 6.2910e-02,
1.8732e-01, 2.3751e-01, 1.2501e-01, 1.5296e-01, 1.7489e-02,
-2.8206e-02, 1.1940e-01, -1.7025e-02, 3.4523e-02, 2.3139e-02,
6.2874e-02, 5.7604e-03, 1.1266e-01, 1.0828e-01, -1.2063e-01,
1.6957e-01, 4.5460e-03, -1.9677e-02, -1.8736e-02, 1.3145e-01,
-9.8840e-02, 3.0137e-02, -6.2143e-02, 6.7086e-02, 8.0808e-02,
1.6178e-01, 4.5842e-02, -9.5969e-02, 2.1006e-01, 9.2914e-02,
9.4967e-02, 2.5004e-01, 9.5483e-03, 1.5119e-01, 1.6314e-01,
1.1977e-02, -1.1157e-01, 1.2317e-01, 1.1549e-01, 3.6469e-02,
2.1972e-01, 6.6304e-02, -2.2025e-01, 6.1853e-02, 8.9013e-02,
1.1424e-02, -2.8260e-01, 1.4574e-03, -5.4400e-03, -3.7646e-02,
8.9454e-03, -7.6040e-02, 3.6176e-02, 1.1077e-01, 4.1273e-02,
-1.3210e-01, 3.3829e-04, 1.5245e-01, 4.9899e-02, 7.2825e-02,
-1.4247e-01, -5.5207e-02, -1.3441e-01, 5.4201e-02, 6.4411e-02,
-5.3626e-02, 2.3186e-02, 1.1818e-01, 2.0110e-02, 1.6029e-01,
4.3654e-02, 2.0811e-01, -7.2397e-02, 4.6134e-02, 1.6096e-01,
8.9308e-02, 1.5639e-01, 9.2628e-02, 1.1626e-01, -6.0540e-03,
-3.8544e-02, 1.1184e-01, 1.1171e-01, -2.7789e-02, -4.0543e-02,
1.7484e-01, -1.1810e-01, 1.3193e-01, 1.4140e-01, -5.8980e-02,
2.8021e-03, 9.9694e-04, 6.2574e-02, 2.5506e-02, -3.2007e-02,
-4.0343e-02, 1.2737e-02, 1.2772e-01, 8.7114e-02, 5.5047e-02,
1.6305e-01, -3.4157e-02, -9.6103e-02, 7.0649e-02, 2.1227e-01,
-2.1461e-02, 2.5751e-01, 1.2815e-01, 1.6047e-01, 5.7591e-02,
2.6102e-01, 1.3121e-01, -7.3381e-02, 8.9716e-02, 6.5462e-02,
1.5307e-01, 5.0791e-02, -2.6012e-01, 5.1867e-02, 1.4094e-02,
-3.3079e-02, 6.2476e-02, 3.4540e-01, 1.1716e-01, 1.2684e-01,
1.8008e-02, -2.9029e-02, 4.5071e-02, 4.0850e-02, 3.2381e-02,
2.7586e-01, 2.1241e-01, -1.2160e-01, 5.5001e-02, 2.5246e-01,
-1.3320e-01, 3.4741e-01, 7.4133e-02, 6.2187e-02, -7.7039e-02,
-1.8373e-02, -5.9771e-02, 4.0363e-02, 3.0250e-01, -1.5204e-01,
4.6913e-02, 1.8593e-02, -5.1675e-02, 1.7085e-01, -9.4747e-03,
5.4208e-02, 1.0210e-01, 2.1106e-01, -5.4499e-02, 8.2601e-02,
-1.5773e+00, 2.2663e-02, 1.2610e-03, 3.0813e-02, 5.4990e-01,
-9.9591e-02, 1.7559e-02, -4.0167e-02, -1.5464e-03, 3.1513e-02,
-9.8943e-02, 1.6316e-01, 2.4608e-01, 2.6529e-02, 3.8808e-03,
-6.2330e-03, 2.5520e-01, -9.1991e-02, -3.3009e-02, 9.3044e-03,
-1.1345e-01, -5.2301e-03, 1.6046e-01, 2.1700e-01, -3.0408e-02,
-4.1163e-04, -1.4484e-03, 1.1401e-01, 4.6460e-02, 2.8586e-02,
2.1193e-02, 1.7275e-01, 2.2710e-01, 1.0662e-01, -1.2396e-01,
-4.0240e-02, 1.0468e-01, -1.7951e-01, 6.7376e-02, -1.6865e-01,
1.8620e-01, 1.4148e-01, 7.8239e-02, 1.2616e-01, 1.9640e-01,
6.2600e-02, 2.1160e-01, 1.2121e-01, -2.2998e-02, 2.2931e-02,
-7.6713e-02, -4.7297e-02, -3.3803e-02, -2.9889e-02, 2.1451e-01,
1.8461e-01, 4.2526e-02, 7.9753e-02, 1.8982e-01, 1.5785e-01,
-7.5004e-03, -3.1074e-02, 1.2518e-01, -1.2754e-01, 2.5969e-03,
2.8029e-01, 1.0266e-01, 9.2749e-02, 5.9568e-02, 1.4050e-02,
2.2228e-01, 9.7542e-02, 6.7391e-02, 9.2528e-03, 2.3900e-02,
-1.4302e-01, -1.4003e-01, -7.4578e-02, -9.9026e-02, -1.9474e-02,
1.0186e-02, 1.6239e-01, 1.1948e-01, 8.0622e-02, 1.4787e-01,
1.1912e-03, -1.6148e-01, -1.2720e-02, -6.8096e-02, 5.1103e-02,
2.1139e-01, 2.1120e-01, 2.6037e-01, 3.4305e-02, 1.1621e-01,
3.9874e-02, 3.7237e-02, 4.0166e-02, -2.0457e-01, 3.0017e-01,
8.6819e-02, 5.9210e-02, -1.3180e-01, 1.3587e-01, 1.9427e-01,
6.2965e-02, 1.3271e-01, 2.5411e-01, 5.7590e-02, 3.9058e-02,
-5.4877e-03, -2.5431e-01, -8.8730e-02, 1.4748e-01, 1.1433e-01,
-5.7770e-02, -4.0391e-02, 8.3335e-02, 1.6901e-03, 1.7750e-01,
-1.1073e-02, 8.9263e-02, 8.8326e-02, 1.2291e-01, 6.2488e-02,
3.3788e-03, 2.7177e-02, 1.6919e-02, -1.9585e-02, -3.3542e-02,
-9.7026e-02, 1.4874e-01, 2.2488e-02, 8.6565e-02, 1.8924e-01,
7.5621e-02, -3.9567e-02, 2.2072e-01, 2.7710e-02, 1.1393e-01,
1.0728e-01, 7.5923e-02, 1.2847e-01, 6.7884e-02, 5.9445e-02,
2.1635e-01, 6.7740e-02, 2.8179e-01, 3.9687e-02, 4.1103e-02,
8.9988e-02, 6.1003e-02, 1.5873e-02, 1.3422e-01, -3.4349e-02,
2.4286e-01, -1.3996e-01, -1.4952e-01, -1.8798e-02, 1.0137e-01,
1.9682e-01, -5.6941e-03, -6.5834e-01, 1.2193e-01, 7.4436e-04,
3.3915e-02, 1.2323e-01, 1.1130e-01, 1.7712e-01, 4.2854e-01,
1.1864e-01, -7.5223e-02, -3.1450e-02, -1.2405e-02, 1.2903e-01,
2.5370e-01, 1.1059e-01, 6.4234e-02, -1.4558e-01, -2.5316e-01,
5.6366e-02, -5.9950e-03, 2.6568e-02, -9.3554e-02, 3.9947e-02,
-1.6267e-01, -5.6109e-03, 1.3383e-01, 2.3148e-02, -1.5536e-01,
-1.7512e-02, 1.2669e-01, 1.7827e-01, -1.0802e-01, 2.8132e-01,
-5.5138e-02, 2.4168e-02, -1.7649e-01, 4.5543e-01, 1.0262e-02,
-6.3519e-02, -1.4750e-02, 4.4689e-01, -2.5658e-02, 1.2053e-01,
2.0096e-01, -1.4621e-03, 9.2406e-02, 2.1397e-01, 1.9116e-02,
-1.6326e-01, 9.1302e-02, 1.4694e-01, 2.3627e-01, 1.0947e-01,
1.2930e-01, 2.3819e-02, -1.2160e-01, 2.8709e-02, -5.8315e-02,
1.9408e-01, -1.9025e-02, -8.7857e-02, -8.4412e-05, -1.7575e-01,
1.4638e-01, 2.1518e-01, 1.1785e-01, -7.9600e-02, -2.1500e-01,
4.0315e-02, 3.9440e-02, 3.3495e-02, 1.3086e-01, 2.0442e-01,
-2.6266e-02, -2.2577e-02, 1.7763e-01, 2.7348e-01, 2.8227e-02,
3.7061e-01, 1.8519e-01, 2.6057e-01, -4.6835e-02, 8.1050e-02,
1.8757e-01, 1.4135e-01, 3.7065e-01, 1.5082e-01, 2.0617e-02,
1.0078e-01, -1.0372e-01, 2.9881e-02, 5.8271e-02, 1.5371e-01,
6.0351e-02, 2.5731e-02, 2.2701e-02, 8.1781e-02, 8.3951e-02,
3.1394e-02, 3.1476e-01, 9.5656e-02, 2.2179e-01, 1.7019e-01,
2.6165e-01, 3.5269e-02])), ('features.32.weight', tensor([[[[-1.3215e-02, -8.8657e-03, -1.4056e-02],
[-3.4552e-03, -1.3000e-02, -1.3764e-02],
[-5.1530e-03, -4.7187e-03, -7.5421e-03]],
[[ 1.5965e-02, 9.7246e-03, 1.3433e-03],
[-6.3261e-03, -3.5627e-03, -5.7481e-03],
[-1.7625e-02, -1.8753e-02, -1.1822e-02]],
[[ 1.1398e-02, 1.7889e-02, 2.0608e-02],
[ 8.7286e-03, 1.0586e-02, 5.6685e-03],
[-1.3305e-02, -3.1666e-03, -5.9130e-03]],
...,
[[ 5.0928e-03, -4.8563e-03, -1.4327e-02],
[-4.7565e-03, -3.0794e-03, -1.2118e-02],
[ 1.2912e-03, 2.3905e-03, -1.4804e-02]],
[[ 2.9734e-02, 3.1346e-02, 2.0905e-02],
[ 1.6181e-02, 1.7316e-02, -2.5040e-03],
[-8.5055e-04, -7.9272e-03, -2.6109e-02]],
[[ 1.6051e-02, 1.5383e-02, 1.6497e-02],
[ 1.9919e-03, 1.8143e-03, 6.1791e-04],
[ 1.0633e-03, -4.9262e-03, -1.4220e-02]]],
[[[-3.3777e-02, -3.0592e-02, -2.4381e-02],
[-1.7067e-02, -3.4865e-02, -2.9977e-02],
[-2.6543e-02, -3.0357e-02, -2.3452e-02]],
[[ 1.2414e-02, 5.0093e-03, 6.2861e-03],
[ 2.1940e-02, 1.0647e-02, -2.2234e-03],
[ 3.3833e-02, 1.1974e-03, 1.3278e-02]],
[[ 1.5937e-02, 1.1262e-02, 3.4076e-03],
[-5.2947e-04, -1.5593e-03, -1.2187e-03],
[-5.6768e-03, -5.9800e-03, 8.6886e-05]],
...,
[[ 1.8186e-02, 2.7837e-02, 6.2656e-03],
[ 1.8460e-02, 3.3252e-02, 9.4248e-03],
[ 9.3330e-03, 1.3712e-02, 1.4209e-02]],
[[ 1.2283e-03, -3.6856e-03, -1.4633e-02],
[ 4.2311e-03, -8.0701e-03, -1.3590e-02],
[ 1.6259e-03, 5.3011e-03, -8.6232e-03]],
[[ 2.4723e-02, 2.4045e-02, 8.6670e-03],
[ 5.3259e-03, -6.7104e-03, -1.2907e-02],
[-1.2546e-02, -5.4137e-03, -4.1501e-03]]],
[[[ 2.3214e-02, 1.9638e-02, 2.2006e-02],
[ 1.3859e-02, 5.5883e-03, -1.4242e-03],
[-8.7998e-03, -8.6840e-03, -9.3636e-03]],
[[ 7.2790e-03, -2.5067e-03, 1.9002e-03],
[ 3.3033e-03, 8.8504e-03, 5.9509e-03],
[ 2.3135e-03, -8.3384e-03, -7.6990e-04]],
[[-5.4312e-03, -1.4004e-02, -1.5800e-02],
[-8.5627e-03, -1.3189e-02, -7.9316e-03],
[-1.3533e-03, -3.0567e-04, -2.6861e-03]],
...,
[[-1.5981e-02, -2.0143e-02, -1.9069e-02],
[ 2.7961e-03, 1.1302e-03, -6.1401e-05],
[ 4.2650e-04, -4.3043e-03, 9.1784e-03]],
[[ 9.9578e-03, 7.0426e-03, 5.6928e-03],
[ 9.1185e-03, 7.2648e-03, 1.2757e-02],
[ 6.5947e-03, 1.4270e-02, 1.2350e-02]],
[[ 2.5905e-02, 2.7557e-02, 2.5754e-02],
[ 1.4092e-02, 1.3402e-02, 1.4171e-02],
[-6.1019e-03, -2.2864e-03, 3.8483e-03]]],
...,
[[[-3.4185e-03, -1.6711e-03, -1.3429e-02],
[-4.7901e-03, -3.9603e-03, -8.4745e-03],
[-9.7596e-03, -1.8737e-02, -9.1477e-03]],
[[-1.7652e-02, 1.9627e-02, 2.6249e-02],
[-2.0960e-02, 1.8501e-02, 3.0466e-02],
[-2.3556e-02, 1.1362e-02, 3.8503e-02]],
[[-1.1822e-02, 2.5667e-03, 1.1159e-02],
[ 5.5238e-03, 3.0815e-03, 4.8975e-04],
[-3.9432e-03, -7.8219e-03, -8.6657e-03]],
...,
[[ 6.8907e-04, 2.1458e-02, 2.7847e-02],
[ 4.6602e-03, 1.5251e-02, 3.2405e-02],
[-1.1398e-02, -1.8792e-03, 1.1990e-02]],
[[-8.6421e-03, 2.1933e-03, 2.0424e-02],
[-1.7571e-02, -1.1312e-02, -3.1353e-03],
[-1.6624e-02, -1.1088e-02, -1.6200e-02]],
[[ 1.8900e-02, 5.0147e-02, 6.8750e-02],
[ 1.9742e-02, 3.2943e-02, 3.5836e-02],
[ 3.8052e-03, -1.9501e-02, -1.3861e-02]]],
[[[-2.0974e-02, -1.9251e-02, -1.3488e-02],
[-2.4446e-02, -1.5675e-02, -1.6319e-02],
[-1.5850e-02, -9.4786e-03, -3.5040e-03]],
[[-1.3794e-02, -1.3765e-02, 3.9522e-03],
[-2.2037e-02, -6.1584e-03, 1.9115e-02],
[-7.5293e-03, 1.0793e-02, 3.0944e-02]],
[[-9.9353e-04, -9.3036e-03, -1.5268e-03],
[-6.7407e-03, -5.7808e-03, -4.9502e-03],
[ 1.5912e-03, 6.4420e-03, 1.3924e-02]],
...,
[[ 1.8916e-02, -1.1328e-03, 7.0829e-03],
[ 1.8693e-02, -6.0499e-03, -1.6279e-03],
[ 1.7629e-02, -3.3821e-03, 5.6262e-03]],
[[ 1.7350e-02, 1.8018e-02, 1.8908e-02],
[ 1.3533e-02, 1.7687e-02, 1.5631e-02],
[ 1.2995e-02, 1.4435e-02, 4.3738e-03]],
[[ 2.0608e-03, 2.3016e-03, 1.6701e-02],
[ 8.9243e-03, 2.6536e-02, 2.4665e-02],
[ 9.2546e-03, 3.1449e-02, 3.3058e-02]]],
[[[-2.9341e-03, 2.3743e-03, 1.8532e-03],
[ 1.3530e-02, 7.3728e-04, -8.2628e-04],
[-2.4918e-03, -6.5893e-03, -1.0776e-02]],
[[ 1.0635e-02, 1.0032e-03, 4.5331e-03],
[-3.8554e-03, -3.7089e-03, -3.4078e-03],
[-1.2055e-03, -4.6014e-03, -7.2101e-04]],
[[ 1.1237e-02, 6.8281e-03, 1.2798e-02],
[-7.9842e-03, -9.7046e-03, -1.3426e-03],
[-1.6079e-02, -2.8943e-03, -8.7750e-04]],
...,
[[-2.5826e-03, 2.2775e-03, 2.6194e-03],
[-6.7322e-04, 8.2555e-03, -4.8549e-03],
[-1.0424e-02, -1.1143e-03, -1.4526e-02]],
[[-2.0618e-02, -3.1320e-02, -2.3285e-02],
[-1.1724e-02, -3.0647e-02, -2.3737e-02],
[-4.9147e-03, -1.8180e-02, -1.3703e-02]],
[[-1.9046e-02, -2.1640e-02, -2.0963e-02],
[ 4.3969e-03, 8.2986e-03, 2.0082e-03],
[ 1.9044e-02, 2.1049e-02, 6.9700e-03]]]])), ('features.32.bias', tensor([-2.1400e-02, 6.4466e-02, 2.7398e-02, -2.3644e-01, -1.8460e-02,
1.5797e-01, 6.7716e-02, 9.1922e-02, 1.0915e-01, 9.3645e-02,
3.5491e-01, 1.0766e-01, 1.9814e-01, 5.3942e-02, -2.6415e-02,
2.4894e-01, -4.6560e-02, 9.2074e-02, 1.4654e-01, 4.5373e-02,
1.5133e-01, 1.3720e-01, 1.8723e-01, 8.3070e-03, 2.2957e-01,
-1.8089e-02, 1.3806e-01, 3.9745e-02, 1.6090e-01, -9.5043e-03,
4.2779e-01, 2.1493e-01, 3.4236e-02, 7.7134e-02, 4.4619e-02,
-1.7547e-01, -1.8777e-01, 6.6511e-02, 3.6353e-02, -1.0416e-01,
1.5615e-01, 1.0418e-01, 1.4269e-01, 3.7591e-03, -1.4491e-01,
1.9662e-01, -1.0009e-01, 1.7664e-01, -3.2554e-01, 1.1134e-01,
1.5603e-01, -9.8756e-02, -1.0469e-01, 1.4183e-01, 9.9085e-02,
1.0956e-01, 3.6883e-02, 9.1448e-02, 2.6367e-01, -1.7308e-01,
1.3565e-01, 8.4406e-02, 9.5404e-02, -8.1696e-03, 7.4023e-02,
1.5638e-01, -4.2472e-02, 8.3594e-02, 2.1316e-01, 2.0749e-01,
1.0009e-01, 1.2299e-01, 5.1167e-03, 3.5469e-02, -2.1651e-02,
-2.0776e-01, -2.0217e-01, 1.5256e-01, 9.0442e-02, 7.9245e-02,
2.9300e-02, 3.8015e-02, -4.9863e-03, 1.6170e-02, -8.8340e-02,
9.6070e-02, -3.3962e-03, 9.1207e-03, 6.9368e-02, 1.3890e-01,
-2.5186e-01, -4.9176e-02, 1.1026e-01, 1.8955e-01, 3.9475e-02,
-4.5339e-02, 6.5043e-02, 1.0749e-01, -3.6584e-02, -1.6119e-01,
-1.5193e-01, 8.6962e-02, 3.7995e-02, 6.5481e-02, -6.3130e-02,
7.8405e-02, -2.0463e-02, -1.3130e-01, 3.2544e-01, 2.4361e-01,
1.3566e-01, 1.0206e-01, 6.9018e-02, -2.5383e-01, 5.9114e-02,
-2.0350e-01, -8.6283e-02, 7.8464e-02, -2.0801e-01, 1.2315e-01,
4.6803e-02, -1.4775e-01, 1.0057e-01, -4.4456e-01, 1.9236e-01,
8.2201e-02, 2.1275e-02, 6.9460e-02, 2.5342e-01, 4.2900e-03,
1.5559e-01, 6.8660e-02, 1.3778e-01, 1.5330e-01, 5.9765e-02,
1.9905e-01, 1.0599e-01, 3.0450e-01, 4.5967e-02, -8.7690e-02,
1.3532e-01, 1.2221e-01, 1.0564e-01, 1.9301e-03, 1.5271e-01,
1.0924e-01, 4.4578e-02, 1.3719e-01, 2.1381e-01, 2.1111e-01,
1.5419e-01, 3.6693e-02, 1.5621e-01, -7.8368e-01, -3.0199e-02,
1.0946e-01, 9.7143e-02, 5.1490e-02, 1.4472e-01, 4.0486e-02,
2.0866e-02, -8.5780e-03, 6.0089e-02, 3.7433e-02, 7.3121e-02,
-1.4780e-02, 1.6029e-01, 2.1558e-01, -6.5300e-04, 2.4402e-03,
1.9980e-01, 1.8452e-01, 2.4950e-01, 1.6289e-01, 2.1371e-01,
-9.0707e-02, -1.7292e-01, 1.5191e-01, 4.0297e-02, -7.4009e-02,
7.1272e-02, 1.0191e-01, 9.9330e-02, 8.9394e-02, 1.5560e-01,
5.2668e-02, -2.6318e-01, 1.1445e-01, 2.5771e-02, -1.0943e-01,
8.1563e-02, -7.0619e-02, 3.2280e-02, 1.3230e-01, -3.3911e-02,
2.8768e-02, 2.0835e-01, 1.8304e-01, 2.5722e-02, 1.5274e-01,
1.5639e-01, 1.8849e-01, 7.8892e-02, 1.3826e-01, -1.1088e-01,
8.3685e-02, 2.9810e-01, -1.0306e-01, 5.8601e-02, -3.0726e-01,
4.7779e-02, 1.5379e-02, 2.1211e-01, 2.4961e-01, 5.1293e-02,
1.2071e-01, 1.9411e-01, -6.1301e-02, -3.4114e-02, 1.9589e-01,
1.4726e-01, 1.1602e-01, -9.9377e-02, 5.7126e-02, 8.9970e-02,
1.4044e-01, -1.0527e+00, 1.5276e-01, 6.7668e-02, -2.8553e-02,
-1.6632e-02, 5.0097e-02, 1.6276e-01, 1.9631e-01, 1.6047e-01,
2.3464e-03, 8.2041e-02, 1.8940e-01, 1.2256e-03, -4.6279e-02,
1.4862e-01, -6.9743e-02, 9.4408e-02, 4.6572e-02, 1.4200e-02,
1.3699e-01, -2.0592e-02, 1.1924e-01, -1.8249e-03, 1.3407e-01,
1.8493e-01, 4.0543e-02, 4.3005e-02, 2.9145e-02, 3.0789e-01,
6.1579e-02, -1.2240e-01, 8.6646e-02, -4.0658e-02, 1.6675e-02,
2.4495e-01, 7.3838e-03, 1.4387e-01, -1.3057e-02, 1.2720e-01,
-4.1627e-02, 6.8986e-02, -2.5368e-03, -1.3614e-01, 9.7507e-02,
1.1291e-01, -1.1472e-02, 1.0550e-01, 1.3433e-01, 9.9580e-02,
1.8150e-01, 5.6307e-02, -5.6649e-02, 2.6789e-02, -2.8932e-01,
-9.3377e-03, 9.3770e-02, 6.7888e-02, -1.8243e-02, 1.0069e-01,
-1.7715e-01, 3.6905e-02, 6.8069e-02, -2.2617e-01, 1.0159e-01,
1.2387e-01, 2.0252e-01, 1.8433e-01, -7.2571e-03, 8.6806e-02,
-7.2950e-02, -2.7135e-02, 1.9177e-01, 1.0006e-02, -2.9306e-02,
-4.3151e-02, 1.3201e-01, 6.2068e-02, 1.5686e-01, -2.4020e-02,
-1.8447e-01, 2.9657e-01, 3.5428e-02, -4.5995e-02, -1.7264e-02,
-7.8411e-03, 1.0647e-01, -1.0917e-01, 5.5614e-02, 6.4051e-02,
-2.6314e-01, 1.6430e-01, 1.4638e-01, -3.8392e-01, -4.7981e-01,
1.4902e-01, 2.3113e-01, 2.9551e-03, 1.4113e-01, 8.3302e-02,
-8.0853e-02, 1.9808e-01, -1.7895e-01, 6.9983e-02, -6.3245e-02,
-1.1012e-01, -2.6038e-04, -2.5424e-03, 1.4419e-01, 1.3537e-01,
3.0137e-02, -3.6075e-02, 1.9969e-02, -1.0845e-02, 2.8108e-01,
9.7921e-02, -3.5578e-02, 8.7234e-02, -1.0327e-02, 7.2221e-02,
1.1079e-01, -9.4369e-02, 2.2449e-02, -3.5661e-02, -1.8254e-01,
5.0316e-02, 1.7312e-01, 4.6950e-02, 2.6093e-01, 7.1057e-02,
-4.6828e-03, 1.1635e-01, 7.8081e-02, 2.1343e-01, 1.1717e-01,
5.3786e-02, 1.4449e-01, 3.1620e-02, 4.8245e-02, 1.6988e-01,
-4.3482e-02, 2.2858e-01, 3.3062e-02, 2.2071e-01, -1.2407e-01,
3.7412e-02, 7.9971e-02, 5.3531e-02, -8.7242e-03, 1.5654e-01,
1.0841e-01, 3.1420e-01, 9.3297e-02, -1.3056e-01, 1.6351e-01,
7.5210e-02, 1.1486e-02, 6.5212e-02, 7.6294e-02, 2.0303e-01,
-6.1739e-02, 5.6558e-02, 2.7781e-01, 1.2212e-01, -2.0712e-01,
6.2284e-02, 1.7409e-01, -2.3294e-02, 4.6277e-02, 1.4508e-02,
-2.2541e-02, 9.3317e-02, 7.6720e-02, 3.9168e-03, 2.2876e-02,
2.3421e-01, -2.8504e-01, 1.6816e-01, 2.2501e-01, 7.4016e-02,
7.6737e-02, 3.7215e-02, -1.6071e-01, 8.8854e-02, 1.9094e-01,
-2.1007e-01, 2.0666e-01, 2.7319e-02, 1.4973e-01, 4.2642e-02,
-1.2370e-01, -2.5921e-03, 4.8596e-02, 2.1111e-02, 9.0651e-02,
-3.0732e-02, 2.1214e-01, 5.7203e-02, 1.5191e-01, -1.4875e-01,
2.3112e-02, 3.4691e-02, -1.2370e-01, 1.6613e-03, 9.1538e-02,
8.2667e-02, 1.2580e-01, 2.6245e-01, 1.3723e-01, -8.0229e-02,
9.3882e-03, -1.1930e-02, 1.0680e-02, -1.8429e-01, -7.5867e-02,
2.0018e-01, 1.8726e-01, 6.8121e-02, 1.7527e-01, 1.3280e-01,
7.1367e-02, 1.2183e-01, 1.1108e-01, -3.4206e-01, 1.0728e-01,
3.3519e-02, 1.5008e-01, -6.6165e-01, 9.7632e-02, 6.7827e-02,
3.3912e-01, 5.3267e-02, 1.1402e-01, -1.0586e-01, 8.9717e-02,
-3.8957e-02, 8.6706e-02, 1.1641e-01, 3.6441e-02, 2.3826e-01,
-4.5166e-04, -2.0848e-01, 3.0176e-01, -4.2135e-02, 1.0892e-01,
1.6205e-01, 1.4473e-02, 7.8987e-02, 7.8335e-02, 6.2553e-02,
7.0565e-03, 9.4852e-02, -1.2491e-01, 1.0971e-01, -2.5085e-01,
-6.5962e-02, 7.7836e-02, 2.8011e-02, 2.0072e-01, -2.3824e-01,
1.5842e-01, 1.3277e-01, 2.1269e-01, 5.7652e-02, 9.6813e-03,
4.0607e-02, 6.3522e-02, -4.3966e-02, 3.5011e-02, 1.0274e-01,
1.3122e-01, 4.7326e-02, 6.7550e-02, -7.9718e-03, 2.4616e-01,
1.9916e-01, 3.8532e-02, -1.3770e-01, 1.9973e-02, 5.4767e-02,
-3.8308e-02, 7.0574e-02, 1.5138e-01, 4.4556e-02, 8.8556e-02,
1.7265e-01, 3.7226e-02])), ('features.34.weight', tensor([[[[ 6.4057e-03, -5.6104e-03, -6.3782e-03],
[ 5.5522e-03, -5.2030e-03, -1.2359e-02],
[ 5.8233e-03, 4.2696e-03, -6.3149e-03]],
[[ 2.0158e-02, -5.7563e-04, -9.4625e-03],
[ 1.7941e-02, -2.8903e-03, -2.7995e-03],
[ 4.7047e-03, -1.1821e-02, -8.9368e-03]],
[[ 2.3352e-05, -4.3150e-03, -8.3558e-03],
[-1.9588e-02, -3.0065e-04, 1.1444e-04],
[-2.7658e-02, -9.0969e-03, -2.4535e-03]],
...,
[[ 2.7049e-02, 1.9788e-02, -1.6020e-03],
[ 2.4772e-02, 2.0182e-02, -5.1620e-03],
[ 8.7211e-03, 5.2141e-03, -7.9942e-04]],
[[-1.8375e-02, -4.3759e-03, 2.8905e-03],
[-1.6004e-02, -4.9442e-03, -1.9713e-02],
[-2.7843e-02, -1.3940e-02, -1.2462e-02]],
[[ 9.2587e-03, -5.0584e-04, -4.9280e-03],
[-4.0497e-03, -4.7096e-03, -1.3803e-02],
[ 6.6668e-03, 4.3470e-03, -1.1463e-02]]],
[[[ 2.5654e-02, 3.3639e-02, 2.4293e-02],
[ 1.5495e-02, 5.6533e-03, 1.2412e-02],
[-1.1672e-02, -1.6189e-02, -2.3941e-02]],
[[-1.3089e-02, -9.7347e-03, -7.8538e-03],
[-2.5446e-02, -1.7887e-02, -1.5814e-02],
[-1.8141e-02, -2.8729e-02, -1.7062e-02]],
[[-8.3563e-03, 1.4734e-03, -1.7811e-02],
[ 2.1165e-03, 1.6519e-03, -2.1347e-03],
[-8.4318e-04, -9.9821e-03, -6.8204e-03]],
...,
[[-1.3063e-02, -2.3060e-02, -1.8300e-03],
[-1.2790e-02, -9.3859e-03, -1.7136e-02],
[-7.1311e-03, -8.0349e-03, 2.9175e-04]],
[[-1.3673e-02, -1.1272e-03, -1.2093e-02],
[ 9.5975e-03, 1.3250e-02, 1.7962e-03],
[ 2.6597e-02, 1.4862e-02, -4.7232e-03]],
[[-1.2095e-02, -1.0988e-02, -1.1099e-02],
[ 3.7043e-03, -8.4039e-03, -1.3576e-02],
[ 6.0661e-03, 7.6004e-04, 8.8895e-03]]],
[[[ 1.6408e-02, 1.0830e-02, 1.3853e-02],
[ 8.0672e-03, 4.3926e-03, 4.3400e-03],
[ 1.6033e-02, 9.2083e-03, 1.3214e-02]],
[[-6.3654e-03, -1.0900e-02, -6.8495e-03],
[ 1.1228e-03, -1.6826e-03, 1.1877e-02],
[ 6.8090e-03, 3.8111e-03, 1.7551e-02]],
[[ 5.8382e-03, -5.3694e-03, 3.8155e-04],
[-4.0764e-03, -1.2377e-02, -1.2680e-03],
[-1.4897e-02, -8.0396e-03, -7.9242e-03]],
...,
[[-3.7190e-03, 1.0956e-02, 4.8894e-03],
[-3.3382e-03, 3.1200e-03, -7.7998e-03],
[-1.1507e-03, -4.0118e-03, -1.4297e-02]],
[[-1.4915e-02, -7.8802e-03, -1.0914e-02],
[-1.2584e-02, -1.2366e-02, -2.0778e-02],
[-3.6117e-02, -2.7396e-02, -2.2059e-02]],
[[-1.2245e-03, 1.6722e-03, -5.4078e-03],
[ 1.0781e-02, 1.9132e-03, 7.8136e-03],
[ 1.7147e-02, 3.2246e-02, 2.3988e-02]]],
...,
[[[ 2.7930e-03, 5.7772e-03, 7.1997e-04],
[-4.9809e-03, 5.0973e-04, -6.0981e-03],
[ 8.6258e-03, -2.6214e-03, -1.0847e-02]],
[[ 2.7985e-02, 3.0507e-02, 9.3137e-03],
[ 2.1632e-02, 1.6707e-02, -3.8403e-04],
[ 1.2096e-02, -3.9432e-05, -1.7210e-02]],
[[-1.2718e-02, -1.3893e-02, -8.1566e-03],
[-8.9290e-03, -1.3556e-02, -1.4134e-02],
[ 3.6138e-03, -7.6041e-03, -5.2208e-03]],
...,
[[-1.1939e-02, -2.1789e-02, -2.3479e-02],
[-3.9319e-03, -7.0435e-03, -1.4048e-02],
[-1.3277e-02, -2.2959e-02, -1.7172e-02]],
[[-9.0374e-03, -1.3154e-02, -6.0878e-03],
[-4.9122e-03, -2.6857e-03, 5.9814e-03],
[ 2.4562e-03, 4.9683e-03, 8.4914e-03]],
[[ 1.7705e-02, -6.7179e-03, -3.0165e-03],
[ 1.0017e-02, -6.9119e-03, -2.6544e-03],
[ 1.7440e-03, -8.2669e-03, -4.0126e-03]]],
[[[-3.0995e-03, -9.8687e-03, -1.4384e-02],
[-6.6103e-03, -1.0893e-02, -7.7181e-03],
[-1.0454e-02, -1.4269e-02, -7.1590e-03]],
[[-1.2614e-02, -1.5302e-02, -1.6946e-02],
[-5.0727e-04, -9.0803e-03, -1.5739e-02],
[ 3.4643e-03, -2.9317e-03, -1.0745e-02]],
[[-2.4278e-03, -3.7237e-03, 8.4396e-04],
[-1.2583e-02, -6.1830e-03, -8.2366e-03],
[-3.1850e-03, -9.6901e-03, -1.3596e-02]],
...,
[[ 3.3286e-03, -6.5561e-03, -2.4367e-02],
[ 5.3881e-03, -5.7209e-03, -1.3674e-02],
[ 1.3441e-03, -3.3200e-03, -2.3920e-02]],
[[-3.3254e-02, -2.1532e-02, -6.4711e-03],
[-3.0650e-02, -3.1585e-02, -3.1640e-03],
[-2.0990e-02, -2.3468e-02, 4.4798e-03]],
[[ 9.0552e-03, 2.1114e-02, 2.7641e-02],
[ 2.1470e-02, 2.4750e-02, 3.0267e-02],
[ 1.5619e-02, 1.8009e-02, 1.8966e-02]]],
[[[-2.9261e-03, -1.8285e-02, -2.4177e-02],
[-7.5416e-03, -1.0634e-02, -1.5787e-02],
[-1.2850e-02, -1.3916e-02, -8.1989e-03]],
[[ 5.2934e-03, -1.2387e-03, -1.4914e-03],
[ 6.7873e-03, 1.0600e-03, -5.4019e-04],
[-6.2149e-04, -3.1924e-03, -1.2121e-02]],
[[-1.1953e-02, -6.6872e-03, -3.6850e-03],
[-8.2929e-03, -6.9736e-03, -1.2173e-02],
[-1.3873e-02, -1.7633e-02, -1.5311e-02]],
...,
[[ 9.4961e-03, 2.6093e-04, -2.6559e-03],
[-5.2116e-03, -6.8281e-03, -2.6961e-02],
[-3.0368e-02, -2.6230e-02, -3.0871e-02]],
[[ 9.3769e-04, 1.8260e-02, 1.1845e-02],
[ 1.8793e-02, 1.3872e-02, 9.8284e-03],
[ 2.0169e-02, 1.4034e-02, 1.2672e-02]],
[[-8.9876e-03, 1.2083e-02, 1.8627e-02],
[ 6.1051e-05, 2.3079e-02, 1.6259e-02],
[ 3.0437e-03, 1.1881e-02, 2.2811e-02]]]])), ('features.34.bias', tensor([ 1.5731e-01, -6.5533e-03, 5.1913e-02, -2.6417e-02, -5.9737e-02,
3.5818e-01, 1.9326e-02, -2.4400e-02, -1.0374e+00, -8.4517e-02,
7.3283e-02, 5.0524e-02, -3.8585e-02, 8.4795e-02, 1.5776e-02,
9.3237e-02, 1.9244e-02, -1.4082e-02, 4.6519e-01, 3.2535e-01,
7.3506e-02, 1.3250e-01, -8.8001e-03, 6.1042e-02, 4.7158e-02,
-6.7527e-02, 8.0073e-02, 9.2457e-02, 1.9967e-01, 6.4213e-02,
-1.2435e-01, 7.1232e-02, 1.2389e-01, 1.7530e-01, 1.0521e-01,
-2.0008e-02, 2.0486e-01, 6.8535e-02, -1.0005e-01, -8.9199e-03,
1.6148e-01, 1.6665e-01, 2.0168e-01, 3.1764e-01, 1.1871e-01,
1.5854e-01, 2.1661e-01, -2.3195e-01, -2.1480e-02, 5.5559e-03,
4.4063e-02, 1.2309e-01, -6.3932e-03, 6.6540e-02, 8.3576e-02,
1.4859e-01, 2.0367e-01, -2.6180e-02, 1.7393e-02, -6.9620e-03,
1.8881e-02, 1.1513e-01, 1.0660e-01, -2.2330e-01, 2.4035e-01,
7.4667e-02, -9.0826e-03, -1.2120e-02, -3.1178e-02, 7.3471e-02,
2.3673e-02, 7.6079e-02, 5.5273e-02, 1.0370e-02, -1.9606e-02,
7.9228e-02, 1.8544e-01, 3.6043e-02, 5.8925e-02, -1.3842e-01,
1.8106e-01, -2.1269e-01, 1.8151e-02, 7.0571e-02, 5.0748e-02,
4.1780e-02, 8.9805e-02, -1.7349e-01, 1.7001e-01, -5.6894e-02,
-3.3280e-02, 1.4353e-01, 8.9018e-02, 1.6304e-01, -4.7968e-02,
-9.5809e-02, -8.2146e-02, -1.0167e-01, 1.4996e-01, 8.0715e-02,
-2.3411e-02, 1.1056e-02, 1.1483e-01, 6.4205e-02, -2.1903e-03,
-9.8520e-02, -8.0127e-02, -8.5893e-04, -6.2755e-03, 4.3044e-01,
-1.4689e-02, 7.9148e-02, 2.0615e-03, 3.8441e-02, 2.1224e-02,
1.2911e-01, -1.7722e-02, 2.5068e-01, 9.4915e-02, 6.2045e-02,
-2.2997e-03, -5.0952e-04, 4.3427e-02, 1.5014e-01, 1.2273e-02,
1.5017e-01, 1.4325e-01, 1.0009e-01, 5.6689e-02, 3.2139e-02,
2.5951e-01, 6.7063e-02, -3.6047e-02, 1.6486e-02, -1.8912e-02,
6.0444e-02, 1.0571e+00, 1.6423e-01, 1.5721e-01, -1.7805e-02,
1.3294e-01, 1.3719e-01, 6.1718e-03, -3.4055e-02, 9.8862e-02,
-2.5892e-02, 1.9548e-02, 2.1528e-01, -3.7502e-02, 4.3931e-02,
2.8239e-01, 1.5612e-01, 3.8641e-02, 2.4527e-01, 1.1263e-01,
-2.1130e-02, 6.1509e-01, 1.7865e-01, 1.0873e-01, 1.9360e-01,
-7.4126e-02, -1.4466e-03, 4.9994e-03, 3.3460e-02, -1.7950e-02,
-1.1090e-01, 1.7558e-01, -1.0951e-02, -4.3409e-02, -1.1036e-01,
4.5788e-02, -1.8001e-01, -1.8189e-02, 1.6233e-01, 3.6884e-01,
1.5815e-01, 5.6507e-02, 9.8894e-02, 9.3949e-02, 3.4091e-01,
3.9113e-02, 2.3251e-04, -6.2403e-02, 3.1302e-02, 2.0869e-01,
1.3763e-01, 4.9091e-02, 3.6735e-02, 1.6306e-01, 4.6837e-02,
6.8301e-03, 2.4630e-01, 1.9119e-01, -8.4259e-02, 2.0576e-02,
6.6113e-02, 1.2664e-01, 7.5991e-02, 7.8335e-03, 2.3461e-01,
6.5672e-02, 1.9104e-01, 4.1648e-02, 1.3461e-01, 9.4458e-02,
1.1815e-01, 6.8200e-02, 4.4764e-02, 1.3333e-01, 1.9723e-02,
1.5108e-01, 1.6163e-01, -1.8736e-02, 2.9873e-01, 8.8800e-03,
3.5709e-02, 1.7235e-01, -4.5143e-02, 6.9958e-03, 1.0380e-01,
7.9205e-02, -2.2476e-02, -7.1872e-03, 5.1648e-02, 1.0196e-01,
-8.1404e-02, 7.8055e-02, 6.7653e-02, 2.6210e-02, 7.5721e-02,
7.5571e-02, 6.9311e-02, 1.5086e-01, 5.8288e-02, 1.7733e-02,
1.2168e-01, 1.4138e-02, 1.3053e-01, 2.3039e-01, -7.8017e-02,
2.4432e-01, 6.2879e-02, 4.9652e-02, 7.8438e-02, 8.9971e-02,
7.7896e-02, 1.1069e-01, 2.6848e-02, -3.7783e-03, 2.3438e-02,
1.8842e-02, 6.3897e-02, 2.1452e-01, -9.6788e-02, 1.0727e-01,
-1.2845e-01, 5.5943e-02, 2.4673e-01, -4.6080e-02, 1.2492e-01,
1.1530e-01, -2.2196e-02, 1.0459e-01, -3.8180e-02, 1.6522e-01,
1.2863e-01, 2.6662e-02, 8.7605e-02, -6.9907e-02, 6.0133e-02,
-2.0503e-02, 5.4154e-02, 1.0821e-01, -2.2444e-02, 7.3504e-02,
-2.7724e-02, 4.9019e-02, 1.4634e-01, 6.7829e-02, 3.7040e-02,
3.4126e-02, -3.2138e-02, -2.1871e-03, 5.7881e-02, -5.7500e-03,
4.9774e-02, 1.2786e-01, 5.2287e-01, 1.1435e-01, -3.4344e-02,
1.0515e-02, 5.7503e-02, 1.8660e-01, 4.9483e-02, -2.3514e-02,
9.1254e-02, -1.4827e-02, 1.0414e-01, 1.8446e-01, 1.1154e-01,
-5.3481e-02, 6.6921e-02, -4.2530e-02, 3.2835e-02, -2.4325e-02,
2.9274e-02, -4.7878e-02, 2.2815e-01, 2.7694e-01, 5.9355e-02,
8.1878e-03, 6.2357e-02, 3.2339e-01, -3.6187e-02, 3.6073e-02,
-4.2983e-02, -7.8441e-02, 1.4662e-02, -1.0895e-01, -5.6419e-03,
8.0781e-02, 6.4320e-02, 2.3453e-01, 2.0020e-01, 3.9262e-02,
1.2397e-01, 5.1584e-02, 1.8723e-01, 1.8693e-01, 1.8420e-02,
2.1001e-03, 8.0270e-02, 2.8388e-01, 1.2010e-02, 8.7825e-02,
1.9944e-02, 1.0565e-01, 2.4087e-01, -4.5402e-02, -8.9811e-02,
-1.1711e-02, 7.6836e-02, -3.5025e-01, 1.5891e-01, -2.6923e-01,
-2.8926e-02, 1.1089e-01, 1.4770e-02, -1.9990e-02, 1.8398e-01,
3.4169e-01, 7.9850e-03, 1.0817e-01, 4.6695e-02, 9.1436e-02,
-1.6279e-01, -2.4528e-01, 1.7095e-01, 7.0245e-02, 1.0925e-02,
3.3909e-02, 2.2430e-02, 3.7841e-02, 2.4725e-01, -4.8658e-03,
-3.0925e-01, 1.2960e-02, 7.1716e-02, -2.5388e-01, 1.0731e-01,
3.7324e-02, 2.1418e-01, -5.3999e-02, -1.3771e-02, -7.3585e-02,
-1.3414e-01, 5.0882e-02, -2.4663e-03, -2.2844e-02, 2.1686e-01,
1.8531e-01, -2.7044e-02, 1.4052e-01, 1.3167e-01, 4.1883e-02,
1.1929e-01, 6.2313e-02, 1.4854e-01, 2.2880e-01, 1.5444e-01,
1.8299e-01, -1.7180e-01, 1.2846e-01, 5.7027e-03, -2.6647e-01,
-1.4068e-01, 1.7644e-01, -4.9150e-02, 2.0700e-01, 1.3515e-01,
2.6716e-01, 9.1334e-03, 8.3232e-02, -3.7115e-02, 1.2940e-01,
1.5925e-01, -9.0140e-02, 1.1334e-01, -2.7049e-01, -4.8131e-02,
-6.3850e-02, 1.0742e-01, 2.3099e-01, 3.6651e-02, 1.1558e-01,
9.7689e-02, 1.5590e-01, 7.7404e-02, -1.0206e-02, 1.1674e-01,
1.2813e-01, 1.4275e-01, 1.7384e-01, 8.7933e-02, -3.7027e-02,
1.7883e-02, -2.7107e-02, 1.4345e-02, 6.7233e-02, -4.1910e-03,
3.4999e-02, 9.1231e-02, 4.9472e-02, -1.9353e-02, 1.7899e-01,
7.2613e-02, -3.5989e-03, -4.2476e-02, 1.4623e-01, 8.1129e-02,
8.8164e-02, 3.0212e-01, 2.2150e-01, 1.3933e-01, 1.2996e-01,
2.8800e-02, 1.9169e-01, -5.3428e-02, 2.0095e-03, 1.6060e-01,
-3.4886e-02, 2.3563e-01, 2.2469e-01, 1.2697e-01, 1.6958e-01,
9.5945e-02, 3.5644e-01, 2.9220e-02, 1.5525e-01, 3.5778e-01,
-2.2172e-02, -8.1075e-03, 4.1692e-02, -2.8637e-02, 2.1933e-03,
3.7724e-01, 3.5916e-02, -6.9374e-02, -1.1480e-01, -8.0757e-02,
9.4747e-06, 2.5585e-01, -3.9682e-02, 2.2938e-01, 2.5238e-02,
1.6486e-02, -4.6438e-02, 9.6950e-02, 6.5098e-02, 1.3750e-01,
-4.4991e-02, 2.4398e-02, 9.0758e-02, 7.2710e-02, 3.4996e-02,
1.2895e-01, -1.8546e-02, -1.1754e-01, -7.6734e-02, 1.2329e-01,
1.1917e-01, -1.3715e-01, -1.1645e-01, 7.0348e-02, -7.9591e-03,
1.3337e-01, 1.1553e-01, 7.9651e-02, 8.2783e-03, -7.9205e-02,
4.9982e-02, 3.3688e-02, 4.1475e-02, 9.8422e-02, 4.3647e-02,
1.9791e-01, 3.6862e-02, 4.6755e-02, -2.1724e-02, 1.1945e-03,
9.0638e-03, 2.6034e-02])), ('classifier.fc1.weight', tensor([[-8.8607e-03, -8.7695e-03, -1.0291e-02, ..., -4.3981e-03,
-5.7434e-03, -2.7040e-03],
[-3.3265e-03, -1.8518e-03, -2.8907e-03, ..., -2.3698e-04,
-8.7294e-03, -3.2204e-03],
[-2.1073e-04, -2.8196e-04, 1.1472e-02, ..., 6.1365e-03,
3.9799e-03, -1.0486e-03],
...,
[-3.0398e-03, -2.9182e-03, -9.8850e-03, ..., 1.0619e-03,
-5.4754e-03, 5.6359e-03],
[-8.2919e-03, -1.9754e-03, -2.7396e-04, ..., -6.2414e-03,
-4.5580e-03, -7.8612e-03],
[-9.1510e-03, -8.1838e-03, -1.2240e-02, ..., -4.2383e-05,
1.9226e-03, 2.9075e-03]])), ('classifier.fc1.bias', tensor([-0.0118, -0.0061, -0.0049, ..., -0.0043, -0.0029, -0.0035])), ('classifier.fc2.weight', tensor([[ 0.0117, 0.0085, 0.0018, ..., -0.0067, 0.0209, -0.0012],
[-0.0054, -0.0198, -0.0009, ..., -0.0114, -0.0150, 0.0056],
[ 0.0069, -0.0063, 0.0020, ..., 0.0175, -0.0194, -0.0077],
...,
[-0.0082, 0.0009, -0.0046, ..., -0.0060, -0.0007, -0.0199],
[ 0.0045, -0.0188, 0.0111, ..., 0.0090, -0.0009, -0.0130],
[ 0.0085, 0.0079, 0.0140, ..., -0.0086, 0.0158, 0.0142]])), ('classifier.fc2.bias', tensor([-0.0515, -0.0587, -0.0498, ..., -0.0404, -0.0362, -0.0845])), ('classifier.fc3.weight', tensor([[-0.0695, -0.0628, 0.0274, ..., -0.1678, 0.0076, -0.0275],
[ 0.0482, -0.0548, -0.0159, ..., -0.1036, -0.0107, -0.0308],
[-0.0821, -0.0839, 0.0505, ..., 0.0125, -0.0184, -0.0522],
...,
[ 0.0084, -0.1566, 0.0012, ..., -0.0251, -0.0145, 0.0136],
[-0.0413, -0.0422, -0.0773, ..., -0.1343, -0.0110, -0.0652],
[-0.1267, -0.1743, -0.0726, ..., -0.0558, -0.0266, 0.0308]])), ('classifier.fc3.bias', tensor([ 5.4974e-02, -4.8970e-02, 4.0842e-02, 1.8182e-02, -4.1770e-02,
-2.2386e-02, -9.5835e-03, -3.5134e-02, 2.1158e-02, 5.4724e-02,
2.0416e-02, 1.1544e-04, 3.2766e-03, -2.1450e-02, -4.3376e-02,
1.3029e-02, 5.0134e-02, 5.5092e-02, -2.9446e-02, 6.5906e-02,
-2.5199e-02, -2.2998e-02, -6.9072e-02, -6.1519e-02, -8.0379e-02,
5.6246e-02, -2.8316e-02, 1.2933e-02, 6.0010e-02, 6.3679e-03,
5.9497e-02, -8.4205e-02, -6.8897e-03, -6.7787e-02, 6.4115e-02,
3.4002e-02, 6.0686e-02, 9.5069e-02, -6.7079e-02, 6.5097e-02,
-4.0295e-03, -4.0887e-02, 9.6675e-02, -1.1161e-02, -8.0864e-02,
-6.3605e-02, -7.4894e-02, 1.3647e-02, 3.1313e-02, 1.1281e-01,
5.5832e-02, -2.3103e-02, -2.0763e-02, 1.0955e-01, -2.0621e-02,
2.9386e-02, -2.4630e-02, 3.8323e-02, -2.7304e-02, -1.4667e-02,
5.0558e-02, 6.3943e-02, -9.7684e-02, 1.1184e-02, 9.4412e-02,
-6.7809e-02, -2.7969e-02, -1.6082e-02, -4.3968e-02, -5.0671e-02,
-1.6837e-03, -8.7347e-02, 1.5257e-02, 3.4891e-02, 4.4593e-02,
-4.1282e-02, -1.2709e-02, -3.2626e-02, 4.3188e-02, -4.3837e-02,
-5.6668e-02, -4.7323e-02, -6.4052e-02, 4.7661e-02, -1.6499e-02,
4.5433e-02, -4.5472e-02, -1.9021e-02, 2.6243e-02, -2.4675e-02,
-1.7310e-02, -6.6584e-02, -2.0164e-02, -4.4208e-03, 3.6655e-03,
1.1863e-01, -5.3265e-02, 9.0935e-03, -6.3700e-03, 2.2094e-02,
3.0706e-02, -4.2949e-02]))]), 'optimizer': {'state': {0: {'step': 2050, 'exp_avg': tensor([[ 5.6052e-45, 5.6052e-45, 5.6052e-45, ..., 5.6052e-45,
5.6052e-45, 0.0000e+00],
[ 5.6052e-45, 5.6052e-45, 5.6052e-45, ..., 0.0000e+00,
5.6052e-45, 5.6052e-45],
[-5.6052e-45, -5.6052e-45, -5.6052e-45, ..., 0.0000e+00,
0.0000e+00, 5.6052e-45],
...,
[-5.6052e-45, 5.6052e-45, 5.6052e-45, ..., 0.0000e+00,
0.0000e+00, 0.0000e+00],
[ 5.6052e-45, 5.6052e-45, 5.6052e-45, ..., 0.0000e+00,
0.0000e+00, 5.6052e-45],
[ 5.6052e-45, 5.6052e-45, 5.6052e-45, ..., 0.0000e+00,
0.0000e+00, 0.0000e+00]], device='cuda:0'), 'exp_avg_sq': tensor([[1.4540e-10, 1.6802e-10, 4.2637e-11, ..., 2.3860e-14, 1.9108e-13,
0.0000e+00],
[3.0800e-10, 1.4925e-09, 1.3106e-09, ..., 0.0000e+00, 1.7375e-12,
1.6086e-12],
[3.2786e-11, 1.3495e-10, 5.8305e-11, ..., 0.0000e+00, 0.0000e+00,
4.3371e-11],
...,
[1.3670e-10, 9.9022e-11, 6.1383e-11, ..., 0.0000e+00, 0.0000e+00,
0.0000e+00],
[3.9491e-12, 3.0269e-11, 1.5469e-11, ..., 0.0000e+00, 0.0000e+00,
5.7265e-14],
[4.6684e-10, 1.3640e-09, 3.2345e-10, ..., 0.0000e+00, 0.0000e+00,
0.0000e+00]], device='cuda:0')}, 1: {'step': 2050, 'exp_avg': tensor([5.6052e-45, 5.6052e-45, 5.6052e-45, ..., 5.6052e-45, 5.6052e-45,
5.6052e-45], device='cuda:0'), 'exp_avg_sq': tensor([1.8469e-10, 2.1491e-09, 2.9593e-10, ..., 3.6040e-10, 1.5084e-10,
2.2984e-11], device='cuda:0')}, 2: {'step': 2050, 'exp_avg': tensor([[ 5.6052e-45, -5.6052e-45, 5.6052e-45, ..., -5.6052e-45,
-5.6052e-45, 5.6052e-45],
[ 5.6052e-45, 5.6052e-45, -5.6052e-45, ..., -5.6052e-45,
5.6052e-45, 5.6052e-45],
[-5.6052e-45, 5.6052e-45, 5.6052e-45, ..., 5.6052e-45,
5.6052e-45, -5.6052e-45],
...,
[ 5.6052e-45, 5.6052e-45, -5.6052e-45, ..., 5.6052e-45,
-5.6052e-45, 5.6052e-45],
[ 5.6052e-45, 5.6052e-45, -5.6052e-45, ..., -5.6052e-45,
-5.6052e-45, 5.6052e-45],
[-5.6052e-45, -5.6052e-45, -5.6052e-45, ..., 5.6052e-45,
-5.6052e-45, 0.0000e+00]], device='cuda:0'), 'exp_avg_sq': tensor([[4.2303e-09, 2.9733e-08, 4.7915e-10, ..., 2.7479e-09, 2.7931e-10,
3.7395e-12],
[5.4354e-11, 5.9952e-09, 6.1507e-11, ..., 2.3153e-10, 6.8133e-12,
4.2664e-12],
[3.8825e-09, 2.6447e-13, 1.2822e-10, ..., 5.5439e-09, 3.0804e-10,
4.8769e-12],
...,
[5.2370e-10, 5.4751e-09, 1.3301e-10, ..., 4.8094e-09, 4.1892e-11,
3.3543e-13],
[4.3981e-14, 1.4938e-10, 1.1184e-12, ..., 1.2102e-13, 3.3870e-11,
1.6934e-12],
[6.6965e-09, 4.6024e-09, 1.4123e-10, ..., 6.5773e-09, 4.4484e-13,
0.0000e+00]], device='cuda:0')}, 3: {'step': 2050, 'exp_avg': tensor([-8.6355e-06, -1.5398e-03, -1.0676e-04, ..., 6.7185e-04,
-3.2732e-31, 1.4426e-04], device='cuda:0'), 'exp_avg_sq': tensor([9.5186e-07, 5.5173e-06, 3.4785e-06, ..., 3.5266e-06, 1.2061e-09,
5.4514e-06], device='cuda:0')}, 4: {'step': 2050, 'exp_avg': tensor([[ 3.2807e-05, 1.0824e-03, 8.5617e-06, ..., 3.9247e-05,
3.1838e-42, -3.9215e-07],
[ 4.4519e-07, -1.0176e-02, 6.4234e-06, ..., 8.5723e-05,
2.7723e-36, 3.2498e-07],
[ 4.1279e-05, 9.2198e-05, 1.2068e-04, ..., 1.2233e-05,
4.7875e-34, 1.6447e-06],
...,
[ 1.8661e-05, 2.1748e-04, -2.9027e-05, ..., 9.7584e-05,
1.3170e-37, 7.5852e-06],
[ 1.2059e-06, 2.9422e-03, -4.6838e-05, ..., 1.4555e-05,
8.2926e-41, 3.9098e-04],
[ 6.5446e-07, 9.5472e-05, 5.2480e-05, ..., 6.0027e-06,
2.5400e-38, -1.6777e-02]], device='cuda:0'), 'exp_avg_sq': tensor([[1.1281e-05, 1.4138e-05, 1.6068e-04, ..., 1.7082e-08, 4.9477e-11,
1.2945e-05],
[3.8945e-05, 2.5724e-04, 8.2717e-05, ..., 2.1168e-06, 5.3626e-11,
1.4782e-05],
[8.2547e-07, 3.4282e-04, 8.0193e-04, ..., 1.7291e-04, 4.4183e-11,
1.5061e-04],
...,
[2.1583e-05, 1.9794e-06, 1.0987e-04, ..., 5.8445e-05, 1.8348e-07,
7.2770e-04],
[9.6312e-05, 2.1877e-05, 4.1244e-05, ..., 2.0523e-07, 7.0159e-11,
4.3517e-05],
[1.3439e-06, 2.8375e-06, 3.8778e-05, ..., 7.5252e-06, 5.0859e-11,
3.2737e-04]], device='cuda:0')}, 5: {'step': 2050, 'exp_avg': tensor([-2.8476e-03, -1.5687e-03, -1.2598e-03, -3.2673e-03, -2.1748e-04,
5.5843e-04, -3.6541e-03, 5.1827e-05, -5.3720e-04, 2.1369e-03,
-2.8126e-03, -2.8092e-04, -9.1686e-04, 1.3388e-05, -4.2834e-04,
-1.5074e-03, -4.0371e-03, 5.6169e-03, -3.0527e-03, 5.1223e-03,
-2.7207e-03, 2.2754e-03, 3.9534e-03, 2.2745e-03, -4.6943e-05,
4.2184e-03, -1.0366e-02, 2.7107e-03, -9.9207e-04, -2.9054e-03,
-2.5657e-04, 6.9963e-03, -2.1705e-03, -1.0175e-04, 7.5509e-04,
2.0431e-03, -2.2207e-04, 1.0255e-03, 3.4444e-03, -9.8886e-03,
4.6562e-03, -1.4991e-03, -4.5741e-03, -2.1555e-04, -3.4609e-04,
7.9900e-04, 1.3774e-04, -7.3379e-04, 1.9566e-03, 1.5251e-03,
2.6474e-03, 2.9298e-03, 4.8663e-04, 1.3794e-03, 7.6521e-05,
1.4934e-03, 3.4049e-04, -1.2474e-03, 1.2263e-03, 7.6448e-05,
1.3691e-03, 2.1954e-03, 5.8917e-04, -3.2394e-03, 3.5344e-03,
-4.1901e-04, 1.1543e-03, 3.6829e-03, -1.1880e-03, 4.5323e-05,
-1.7459e-03, 1.0779e-03, 9.5878e-04, 7.6265e-04, -4.5636e-03,
-3.2206e-04, -2.4102e-03, 3.6395e-03, -5.4391e-03, -1.1527e-04,
5.0707e-03, 1.2276e-03, -4.3471e-03, 1.9254e-03, -2.2847e-03,
3.7299e-03, 4.7963e-03, 2.3822e-03, 6.2641e-04, 9.6660e-03,
8.8450e-04, -1.1847e-03, 1.8348e-03, -8.9435e-04, -3.3354e-03,
-3.5307e-03, -3.5674e-03, -1.2635e-03, 1.0034e-03, -5.1744e-03,
-4.0091e-03, -1.3758e-03], device='cuda:0'), 'exp_avg_sq': tensor([9.5594e-05, 8.8613e-05, 9.8245e-05, 1.5179e-04, 6.3929e-05, 2.1328e-04,
1.0757e-04, 7.1778e-05, 1.0311e-04, 1.0288e-04, 1.1218e-04, 9.1954e-05,
1.7949e-04, 1.3334e-04, 5.6128e-05, 1.0192e-04, 7.2200e-05, 1.2544e-04,
7.5826e-05, 1.0721e-04, 7.0316e-05, 7.7432e-05, 4.1242e-05, 1.1703e-04,
8.4120e-05, 1.4827e-04, 1.6028e-04, 1.3894e-04, 1.2922e-04, 8.5106e-05,
9.1609e-05, 6.3572e-05, 1.4377e-04, 1.0513e-04, 1.1495e-04, 1.1922e-04,
1.7268e-04, 1.5768e-04, 1.9745e-04, 1.3714e-04, 2.7782e-04, 9.9004e-05,
1.1524e-04, 2.9417e-04, 4.7031e-05, 1.2083e-04, 5.9411e-05, 1.4530e-04,
1.1548e-04, 5.0299e-04, 1.3139e-04, 1.9425e-04, 6.4452e-05, 1.7564e-04,
9.3126e-05, 1.0344e-04, 1.0571e-04, 1.1806e-04, 3.6900e-05, 9.6611e-05,
7.7781e-05, 1.6589e-04, 6.9364e-05, 7.1141e-05, 1.7319e-04, 8.8678e-05,
1.0116e-04, 1.1348e-04, 8.6746e-05, 8.8252e-05, 7.7027e-05, 1.0246e-04,
2.6862e-04, 2.2104e-04, 3.2462e-04, 1.2849e-04, 2.0393e-04, 1.5256e-04,
2.6071e-04, 6.6147e-05, 6.7028e-05, 1.2931e-04, 1.4028e-04, 2.4826e-04,
2.5010e-04, 2.4390e-04, 1.4482e-04, 1.1784e-04, 1.3503e-04, 2.5993e-04,
2.7469e-04, 1.1944e-04, 1.9363e-04, 1.2827e-04, 1.3261e-04, 1.0641e-04,
1.8560e-04, 2.5891e-04, 3.0391e-04, 1.8834e-04, 1.8188e-04, 8.9142e-05],
device='cuda:0')}}, 'param_groups': [{'lr': 0.001, 'betas': (0.9, 0.999), 'eps': 1e-08, 'weight_decay': 0, 'amsgrad': False, 'params': [0, 1, 2, 3, 4, 5]}]}}
test_network(model=model2, data_type='test', device=device, criterion=nn.NLLLoss())
Runing on : cuda Test Accuracy: 0.800797 -----
Now you'll write a function to use a trained network for inference. That is, you'll pass an image into the network and predict the class of the flower in the image. Write a function called predict that takes an image and a model, then returns the top $K$ most likely classes along with the probabilities. It should look like
probs, classes = predict(image_path, model)
print(probs)
print(classes)
> [ 0.01558163 0.01541934 0.01452626 0.01443549 0.01407339]
> ['70', '3', '45', '62', '55']
First you'll need to handle processing the input image such that it can be used in your network.
You'll want to use PIL to load the image (documentation). It's best to write a function that preprocesses the image so it can be used as input for the model. This function should process the images in the same manner used for training.
First, resize the images where the shortest side is 256 pixels, keeping the aspect ratio. This can be done with the thumbnail or resize methods. Then you'll need to crop out the center 224x224 portion of the image.
Color channels of images are typically encoded as integers 0-255, but the model expected floats 0-1. You'll need to convert the values. It's easiest with a Numpy array, which you can get from a PIL image like so np_image = np.array(pil_image).
As before, the network expects the images to be normalized in a specific way. For the means, it's [0.485, 0.456, 0.406] and for the standard deviations [0.229, 0.224, 0.225]. You'll want to subtract the means from each color channel, then divide by the standard deviation.
And finally, PyTorch expects the color channel to be the first dimension but it's the third dimension in the PIL image and Numpy array. You can reorder dimensions using ndarray.transpose. The color channel needs to be first and retain the order of the other two dimensions.
from PIL import Image
def process_image(image, size=256, crop_size=224):
''' Scales, crops, and normalizes a PIL image for a PyTorch model,
returns an Numpy array
'''
# Ref: for image resizing with respect to ration https://gist.github.com/tomvon/ae288482869b495201a0
image = Image.open(image).convert("RGB")
mean, std = np.array([0.485, 0.456, 0.406]), np.array([0.229, 0.224, 0.225])
og_size = image.size
width, height = image.size
# Resize our image while keeping our aspect ration
width_percent = (size / float(width))
height = int((float(height) * float(width_percent)))
image = image.resize((size, height))
print(f'Image resized to: {image.size}, from: {og_size}')
# crop our image from the middle out
width, height = image.size
left = (width - crop_size) / 2
upper = (height - crop_size) /2
right = left + crop_size
lower = upper + crop_size
print(f'left: {left}, upper: {upper}, right: {right}, lower: {lower}')
image = image.crop((left, upper, right, lower))
#convert to float array in numpy
np_image = np.array(image) / 255
# subtract means from each color channel and divide by std deviation
np_image = (np_image - mean) / std
# finally, transpose the dimensions. PyTorch expects the oclor channel to be the first dimension
# buts its the third in the PIL image and numpy array.
np_image = np_image.transpose((2, 0, 1))
return np_image
t1 = process_image(data_dirs['valid'] + '/1/image_06758.jpg')
type(t1)
Image resized to: (256, 384), from: (500, 750) left: 16.0, upper: 80.0, right: 240.0, lower: 304.0
numpy.ndarray
To check your work, the function below converts a PyTorch tensor and displays it in the notebook. If your process_image function works, running the output through this function should return the original image (except for the cropped out portions).
def imshow(image, ax=None, title=None):
"""Imshow for Tensor."""
if ax is None:
fig, ax = plt.subplots()
# PyTorch tensors assume the color channel is the first dimension
# but matplotlib assumes is the third dimension
image = image.numpy().transpose((1, 2, 0))
# Undo preprocessing
mean = np.array([0.485, 0.456, 0.406])
std = np.array([0.229, 0.224, 0.225])
image = std * image + mean
# Image needs to be clipped between 0 and 1 or it looks like noise when displayed
image = np.clip(image, 0, 1)
ax.imshow(image)
return ax
# np_array is currently a numpy array, we need to convert to a tensor
np_array = process_image(data_dirs['valid'] + '/1/image_06758.jpg')
print(type(np_array))
t1_tensor = torch.from_numpy(np_array)
imshow(t1_tensor)
Image resized to: (256, 384), from: (500, 750) left: 16.0, upper: 80.0, right: 240.0, lower: 304.0 <class 'numpy.ndarray'>
<matplotlib.axes._subplots.AxesSubplot at 0x2c855247d30>
Once you can get images in the correct format, it's time to write a function for making predictions with your model. A common practice is to predict the top 5 or so (usually called top-$K$) most probable classes. You'll want to calculate the class probabilities then find the $K$ largest values.
To get the top $K$ largest values in a tensor use x.topk(k). This method returns both the highest k probabilities and the indices of those probabilities corresponding to the classes. You need to convert from these indices to the actual class labels using class_to_idx which hopefully you added to the model or from an ImageFolder you used to load the data (see here). Make sure to invert the dictionary so you get a mapping from index to class as well.
Again, this method should take a path to an image and a model checkpoint, then return the probabilities and classes.
probs, classes = predict(image_path, model)
print(probs)
print(classes)
> [ 0.01558163 0.01541934 0.01452626 0.01443549 0.01407339]
> ['70', '3', '45', '62', '55']
def predict(image_path, model, topk=5):
''' Predict the class (or classes) of an image using a trained deep learning model.
'''
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
# TODO: Implement the code to predict the class from an image file
model.eval()
#model.to(device=device)
# Load image, convert to nparray and then to a tensor
tensor = torch.from_numpy(process_image(image_path)).to(device, dtype=torch.float)
print(tensor.shape)
tensor = tensor.unsqueeze(0)
print(tensor.shape)
output = model.forward(tensor)
probabilities = torch.exp(output)
top_ps, top_classes = probabilities.data.topk(topk)
top_ps, top_classes = top_ps.cpu(), top_classes.cpu()
class_to_idx_inverse = {model.class_to_idx[i]: i for i in model.class_to_idx}
mapped_labels = []
for label in top_classes.numpy()[0]:
mapped_labels.append(class_to_idx_inverse[label])
return top_ps.numpy()[0], mapped_labels
acc, classes = predict(data_dirs['valid'] + '/71/image_04517.jpg', model)
print(f'Prediction Accuracies: {acc}')
print(f'Class Prediction: {classes}')
Image resized to: (256, 191), from: (667, 500) left: 16.0, upper: -16.5, right: 240.0, lower: 207.5 torch.Size([3, 224, 224]) torch.Size([1, 3, 224, 224]) Prediction Accuracies: [9.9999952e-01 4.8456036e-07 5.1891771e-09 2.0960683e-10 2.7502405e-11] Class Prediction: ['71', '5', '63', '100', '8']
Now that you can use a trained model for predictions, check to make sure it makes sense. Even if the testing accuracy is high, it's always good to check that there aren't obvious bugs. Use matplotlib to plot the probabilities for the top 5 classes as a bar graph, along with the input image. It should look like this:

You can convert from the class integer encoding to actual flower names with the cat_to_name.json file (should have been loaded earlier in the notebook). To show a PyTorch tensor as an image, use the imshow function defined above.
image_paths = {
'71': data_dirs['test'] + '/71/image_04517.jpg',
'98': data_dirs['valid'] + '/98/image_07820.jpg',
'98.2': data_dirs['valid'] + '/98/image_07792.jpg',
'41': data_dirs['valid'] + '/41/image_02219.jpg',
'41.2': data_dirs['valid'] + '/41/image_02268.jpg',
'99': data_dirs['valid'] + '/99/image_07869.jpg',
'99.2': data_dirs['valid'] + '/99/image_08063.jpg',
'8': data_dirs['valid'] + '/8/image_03366.jpg',
'8.2': data_dirs['valid'] + '/8/image_03313.jpg',
}
for img_data in image_paths.items():
top_prob, top_classes = predict(img_data[1], model)
label = top_classes[0]
fig = plt.figure(figsize=(6,6))
sp_img = plt.subplot2grid((15,9), (0,0), colspan=9, rowspan=9)
sub_prob = plt.subplot2grid((15,9), (9,2), colspan=5, rowspan=5)
image = Image.open(img_data[1])
sp_img.axis('off')
sp_img.set_title(f'{cat_to_name[label]} ' + ' '.join(img_data[1].split('/')[2:]))
sp_img.imshow(image)
labels = []
for class_idx in top_classes:
labels.append(cat_to_name[class_idx])
yp = np.arange(5)
sub_prob.set_yticks(yp)
sub_prob.set_yticklabels(labels)
sub_prob.set_xlabel('Probability')
sub_prob.invert_yaxis()
sub_prob.barh(yp, top_prob, xerr=0,
align='center',
color='orange')
plt.show()
Image resized to: (256, 191), from: (667, 500) left: 16.0, upper: -16.5, right: 240.0, lower: 207.5 torch.Size([3, 224, 224]) torch.Size([1, 3, 224, 224])
Image resized to: (256, 368), from: (500, 720) left: 16.0, upper: 72.0, right: 240.0, lower: 296.0 torch.Size([3, 224, 224]) torch.Size([1, 3, 224, 224])
Image resized to: (256, 170), from: (753, 501) left: 16.0, upper: -27.0, right: 240.0, lower: 197.0 torch.Size([3, 224, 224]) torch.Size([1, 3, 224, 224])
Image resized to: (256, 191), from: (667, 500) left: 16.0, upper: -16.5, right: 240.0, lower: 207.5 torch.Size([3, 224, 224]) torch.Size([1, 3, 224, 224])
Image resized to: (256, 186), from: (687, 500) left: 16.0, upper: -19.0, right: 240.0, lower: 205.0 torch.Size([3, 224, 224]) torch.Size([1, 3, 224, 224])
Image resized to: (256, 250), from: (512, 500) left: 16.0, upper: 13.0, right: 240.0, lower: 237.0 torch.Size([3, 224, 224]) torch.Size([1, 3, 224, 224])
Image resized to: (256, 286), from: (500, 559) left: 16.0, upper: 31.0, right: 240.0, lower: 255.0 torch.Size([3, 224, 224]) torch.Size([1, 3, 224, 224])
Image resized to: (256, 245), from: (522, 500) left: 16.0, upper: 10.5, right: 240.0, lower: 234.5 torch.Size([3, 224, 224]) torch.Size([1, 3, 224, 224])
Image resized to: (256, 170), from: (750, 500) left: 16.0, upper: -27.0, right: 240.0, lower: 197.0 torch.Size([3, 224, 224]) torch.Size([1, 3, 224, 224])
cat_to_name['99']
'bromelia'